| datasetId | card |
|---|---|
nluai/ZaloAI_ForMat | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 253977
num_examples: 677
download_size: 126853
dataset_size: 253977
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
jonathan-roberts1/SAT-6 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': barren land
'1': building
'2': grassland
'3': road
'4': trees
'5': water
splits:
- name: train
num_bytes: 120518797
num_examples: 81000
download_size: 142842069
dataset_size: 120518797
license: other
---
# Dataset Card for "SAT-6"
## Dataset Description
- **Paper** [Deepsat: a learning framework for satellite imagery](https://dl.acm.org/doi/pdf/10.1145/2820783.2820816)
- **Split** Test
### Split Information
This HuggingFace dataset repository contains just the 'Test' split.
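The integer labels in the dataset metadata map to class names; a minimal sketch of decoding them (the mapping is copied from the `class_label` names above, and the helper is illustrative, not part of the dataset tooling):

```python
# Class-label mapping taken from the dataset metadata ('0' .. '5').
SAT6_CLASSES = ["barren land", "building", "grassland", "road", "trees", "water"]

def decode_label(label_id: int) -> str:
    """Map an integer label from the dataset to its class name."""
    return SAT6_CLASSES[label_id]
```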
### Licensing Information
Public Domain
## Citation Information
[https://dl.acm.org/doi/pdf/10.1145/2820783.2820816](https://dl.acm.org/doi/pdf/10.1145/2820783.2820816)
```
@inproceedings{basu2015deepsat,
title = {Deepsat: a learning framework for satellite imagery},
author = {Basu, Saikat and Ganguly, Sangram and Mukhopadhyay, Supratik and DiBiano, Robert and Karki, Manohar and Nemani, Ramakrishna},
year = 2015,
booktitle = {Proceedings of the 23rd SIGSPATIAL international conference on advances in geographic information systems},
pages = {1--10}
}
```
|
transcendingvictor/delphi-llama2-1.6m-validation-logprobs | ---
dataset_info:
features:
- name: logprobs
sequence: float64
splits:
- name: validation
num_bytes: 45818277
num_examples: 10982
download_size: 37795921
dataset_size: 45818277
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
goendalf666/sql-chat-instructions | ---
dataset_info:
features:
- name: training_input
dtype: string
splits:
- name: train
num_bytes: 20267285
num_examples: 78577
download_size: 6323963
dataset_size: 20267285
---
# Dataset Card for "sql-chat-instructions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tz3r0n4r/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
techandy42/ppo-200K-collected-dataset-steps-1 | ---
dataset_info:
features:
- name: observation
sequence:
sequence:
sequence: float32
- name: action
sequence: int64
- name: reward
sequence: float32
- name: done
sequence: bool
splits:
- name: train
num_bytes: 353539
num_examples: 2324
download_size: 62825
dataset_size: 353539
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
matlok/python-copilot-training-from-many-repos-large | ---
license:
- other
pretty_name: >-
python copilot large coding dataset
dataset_info:
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-code-large-v1_00000013.parquet
size_categories:
- 100K<n<1M
- 1M<n<10M
tags:
- python-copilot
- python-coding
- fine-tuning
- training
- alpaca
- text
- coding
# supported task_categories
# text-classification, token-classification, table-question-answering, question-answering, zero-shot-classification, translation, summarization, conversational, feature-extraction, text-generation, text2text-generation, fill-mask, sentence-similarity, text-to-speech, text-to-audio, automatic-speech-recognition, audio-to-audio, audio-classification, voice-activity-detection, depth-estimation, image-classification, object-detection, image-segmentation, text-to-image, image-to-text, image-to-image, image-to-video, unconditional-image-generation, video-classification, reinforcement-learning, robotics, tabular-classification, tabular-regression, tabular-to-text, table-to-text, multiple-choice, text-retrieval, time-series-forecasting, text-to-video, visual-question-answering, document-question-answering, zero-shot-image-classification, graph-ml, mask-generation, zero-shot-object-detection, text-to-3d, image-to-3d, other
task_categories:
- text-generation
# supported task_ids
# acceptability-classification, entity-linking-classification, fact-checking, intent-classification, language-identification, multi-class-classification, multi-label-classification, multi-input-text-classification, natural-language-inference, semantic-similarity-classification, sentiment-classification, topic-classification, semantic-similarity-scoring, sentiment-scoring, sentiment-analysis, hate-speech-detection, text-scoring, named-entity-recognition, part-of-speech, parsing, lemmatization, word-sense-disambiguation, coreference-resolution, extractive-qa, open-domain-qa, closed-domain-qa, news-articles-summarization, news-articles-headline-generation, dialogue-generation, dialogue-modeling, language-modeling, text-simplification, explanation-generation, abstractive-qa, open-domain-abstractive-qa, closed-domain-qa, open-book-qa, closed-book-qa, slot-filling, masked-language-modeling, keyword-spotting, speaker-identification, audio-intent-classification, audio-emotion-recognition, audio-language-identification, multi-label-image-classification, multi-class-image-classification, face-detection, vehicle-detection, instance-segmentation, semantic-segmentation, panoptic-segmentation, image-captioning, image-inpainting, image-colorization, super-resolution, grasping, task-planning, tabular-multi-class-classification, tabular-multi-label-classification, tabular-single-column-regression, rdf-to-text, multiple-choice-qa, multiple-choice-coreference-resolution, document-retrieval, utterance-retrieval, entity-linking-retrieval, fact-checking-retrieval, univariate-time-series-forecasting, multivariate-time-series-forecasting, visual-question-answering, document-question-answering
task_ids:
- parsing
---
## Python Copilot Large Coding Dataset
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains Python code (either a class method or a global function) together with its imported modules, base classes (if any), and its exceptions, returns, and arguments (each ordered as they appear in the code), and more.
- Rows: 2350782
- Size: 3.1 GB
- Data type: text
- Format: Extracted code using python AST
### Schema
```json
{
"args": "string",
"class_bases": "string",
"class_docstr": "string",
"class_docstr_tok": "string",
"class_name": "string",
"code": "string",
"code_tok": "string",
"docstr": "string",
"docstr_tok": "string",
"file_path": "string",
"filename": "string",
"imports": "string",
"is_member": "bool",
"label_desc": "string",
"label_desc_len": "int64",
"label_id": "string",
"lend": "int64",
"lstart": "int64",
"name": "string",
"num_all_bases": "float64",
"num_bases": "float64",
"num_classes": "float64",
"num_functions": "int64",
"num_imports": "int64",
"num_methods": "float64",
"raises": "string",
"returns": "string",
"total_objects": "int64"
}
```
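A row can be sanity-checked against this schema with a small helper; a minimal sketch (the `SCHEMA` subset, sample row, and `check_row` helper are illustrative, not part of the dataset tooling):

```python
# Map a subset of the schema's type names to Python types for a lightweight check.
SCHEMA = {
    "args": str, "class_name": str, "code": str, "docstr": str,
    "file_path": str, "imports": str, "is_member": bool,
    "label_desc_len": int, "lend": int, "lstart": int,
    "num_functions": int, "num_imports": int, "total_objects": int,
}

def check_row(row: dict) -> list:
    """Return the schema fields whose values have the wrong type."""
    return [key for key, typ in SCHEMA.items()
            if key in row and not isinstance(row[key], typ)]

row = {"class_name": "Example", "is_member": True, "lstart": 10, "lend": 20}
bad_fields = check_row(row)  # empty when all present fields match the schema
```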
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-copilot-training-from-many-repos-large", data_dir="files")
```
|
katxtong/tokenized_squad_size384 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: train
num_bytes: 172176192
num_examples: 88568
- name: validation
num_bytes: 20975760
num_examples: 10790
download_size: 27919691
dataset_size: 193151952
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
lscpku/VITATECS | ---
license: cc-by-4.0
---
# Dataset Card for VITATECS
## Dataset Description
### Dataset Summary
VITATECS is a diagnostic VIdeo-Text dAtaset for the evaluation of TEmporal Concept underStanding.
**[2023/11/27]** We have updated a new version of VITATECS which is generated using ChatGPT. The previous version generated by OPT-175B can be found [here](https://github.com/lscpku/VITATECS/tree/main/data_opt).
### Languages
English.
## Dataset Structure
### Data Instances
This repo contains 6 jsonl files, each of which corresponds to an aspect of temporal concepts (Direction, Intensity, Sequence, Localization, Compositionality, Type).
Example (indented for better presentation):
```
{
"src_dataset": "VATEX",
"video_name": "i0ccSYMl0vo_000027_000037.mp4",
"caption": "A woman is placing a waxing strip on a man's leg.",
"counterfactual": "A woman is removing a waxing strip from a man's leg.",
"aspect": "Direction"
}
```
### Data Fields
- src_dataset: the name of the source dataset (VATEX or MSRVTT)
- video_name: the name of the video in the source dataset
- caption: the original caption of the video
- counterfactual: the generated counterfactual description of the video
- aspect: the aspect of temporal concepts covered by the sample (Direction, Intensity, Sequence, Localization, Compositionality, or Type)
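Records like the example above can be parsed with the standard library alone; a minimal sketch (the in-memory record stands in for a line of one of the jsonl files):

```python
import json

# One line of a VITATECS jsonl file, taken from the example record above.
line = (
    '{"src_dataset": "VATEX", '
    '"video_name": "i0ccSYMl0vo_000027_000037.mp4", '
    '"caption": "A woman is placing a waxing strip on a man\'s leg.", '
    '"counterfactual": "A woman is removing a waxing strip from a man\'s leg.", '
    '"aspect": "Direction"}'
)

def parse_jsonl(text: str) -> list:
    """Parse jsonl content into a list of records, skipping blank lines."""
    return [json.loads(l) for l in text.splitlines() if l.strip()]

records = parse_jsonl(line)
pair = (records[0]["caption"], records[0]["counterfactual"])
```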
### Dataset Statistics
| | Direction | Intensity | Sequence | Localization | Compositionality | Type |
| ------------------------- | --------- | --------- | -------- | ------------ | ---------------- | ----- |
| # samples | 2,709 | 745 | 380 | 1,788 | 2,393 | 8,109 |
| # videos | 2,016 | 650 | 348 | 1,453 | 1,739 | 4,856 |
| Avg. len (caption) | 13.02 | 13.04 | 15.58 | 14.37 | 13.29 | 11.34 |
| Avg. len (counterfactual) | 13.12 | 13.05 | 15.74 | 14.43 | 13.53 | 11.35 |
## Dataset Creation
### Source Data
VITATECS is based on video-text pairs from [MSR-VTT](https://www.microsoft.com/en-us/research/publication/msr-vtt-a-large-video-description-dataset-for-bridging-video-and-language/) and VATEX.
### Annotations
#### Annotation process
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
Part of this dataset is generated by large language models and may contain toxic or biased texts.
We mitigate this issue by leveraging [Perspective API](https://developers.perspectiveapi.com/) to filter out highly toxic generations.
## Additional Information
### Dataset Curators
VITATECS is curated by Shicheng Li, Lei Li, Shuhuai Ren, Yuanxin Liu, Yi Liu, Rundong Gao, Xu Sun (Peking University) and Lu Hou (Huawei Noah's Ark Lab).
### Licensing Information
This dataset is under [CC-BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tosh97/huggingface_agg_kor_sorted | ---
dataset_info:
features:
- name: ko
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 3990832726.0
num_examples: 11818226
download_size: 2519201456
dataset_size: 3990832726.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "huggingface_agg_kor_sorted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/ENN_class_embeddings_dim_64 | ---
dataset_info:
features:
- name: last_hs
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 18028896
num_examples: 67272
download_size: 24547776
dataset_size: 18028896
---
# Dataset Card for "ENN_class_embeddings_dim_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kw1018/llama2-template-capjack | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1719457
num_examples: 2500
download_size: 554566
dataset_size: 1719457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-13B-LoRA-assemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-13B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T12:38:31.031518](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble/blob/main/results_2023-10-28T12-38-31.031518.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018246644295302015,\n\
\ \"em_stderr\": 0.0013706682452812897,\n \"f1\": 0.12087667785234917,\n\
\ \"f1_stderr\": 0.002262552570535497,\n \"acc\": 0.4228981679335413,\n\
\ \"acc_stderr\": 0.009810986357152753\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.018246644295302015,\n \"em_stderr\": 0.0013706682452812897,\n\
\ \"f1\": 0.12087667785234917,\n \"f1_stderr\": 0.002262552570535497\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0841546626231994,\n \
\ \"acc_stderr\": 0.0076470240466032045\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702302\n\
\ }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|arc:challenge|25_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T12_38_31.031518
path:
- '**/details_harness|drop|3_2023-10-28T12-38-31.031518.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T12-38-31.031518.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T12_38_31.031518
path:
- '**/details_harness|gsm8k|5_2023-10-28T12-38-31.031518.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T12-38-31.031518.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hellaswag|10_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T23-30-08.066135.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T23-30-08.066135.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T12_38_31.031518
path:
- '**/details_harness|winogrande|5_2023-10-28T12-38-31.031518.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T12-38-31.031518.parquet'
- config_name: results
data_files:
- split: 2023_09_13T23_30_08.066135
path:
- results_2023-09-13T23-30-08.066135.parquet
- split: 2023_10_28T12_38_31.031518
path:
- results_2023-10-28T12-38-31.031518.parquet
- split: latest
path:
- results_2023-10-28T12-38-31.031518.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-13B-LoRA-assemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-13B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-13B-LoRA-assemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T12:38:31.031518](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-13B-LoRA-assemble/blob/main/results_2023-10-28T12-38-31.031518.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.018246644295302015,
"em_stderr": 0.0013706682452812897,
"f1": 0.12087667785234917,
"f1_stderr": 0.002262552570535497,
"acc": 0.4228981679335413,
"acc_stderr": 0.009810986357152753
},
"harness|drop|3": {
"em": 0.018246644295302015,
"em_stderr": 0.0013706682452812897,
"f1": 0.12087667785234917,
"f1_stderr": 0.002262552570535497
},
"harness|gsm8k|5": {
"acc": 0.0841546626231994,
"acc_stderr": 0.0076470240466032045
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702302
}
}
```
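As a quick illustration (not part of the generated card), the aggregated metrics above can be inspected as a plain Python dict; the values below are copied from the results shown:

```python
# Sketch: the per-task metrics from the latest run, as shown above.
results = {
    "harness|drop|3": {"em": 0.018246644295302015, "f1": 0.12087667785234917},
    "harness|gsm8k|5": {"acc": 0.0841546626231994},
    "harness|winogrande|5": {"acc": 0.7616416732438832},
}

# Pick the task with the highest accuracy among those reporting "acc".
acc_tasks = {task: m["acc"] for task, m in results.items() if "acc" in m}
best_task = max(acc_tasks, key=acc_tasks.get)
print(best_task)  # winogrande reports the highest accuracy in this run
```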
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jan-hq/openhermes_dpo_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 2974546376.1
num_examples: 890541
- name: test
num_bytes: 330505152.9
num_examples: 98949
download_size: 1766050108
dataset_size: 3305051529.0
---
# Dataset Card for "openhermes_dpo_binarized"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
elliotthwang/openassistant-guanaco-chinese_1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1284535.2427381678
num_examples: 1000
download_size: 844856
dataset_size: 1284535.2427381678
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bzantium/LITM | ---
license: apache-2.0
configs:
- config_name: kv75
data_files:
- split: test
path: "data/kv75.jsonl"
- config_name: kv140
data_files:
- split: test
path: "data/kv140.jsonl"
- config_name: kv300
data_files:
- split: test
path: "data/kv300.jsonl"
- config_name: qa10
data_files:
- split: test
path: "data/qa10.jsonl"
- config_name: qa20
data_files:
- split: test
path: "data/qa20.jsonl"
- config_name: qa30
data_files:
- split: test
path: "data/qa30.jsonl"
task_categories:
- question-answering
tags:
- lost-in-the-middle
size_categories:
- n<1K
---
# Datasets for Lost In The Middle
This repository contains datasets used in the paper ["Lost in the Middle: How Language Models Use Long Contexts"](https://arxiv.org/abs/2307.03172), focusing on multi-document question answering and key-value retrieval tasks.
## Datasets Overview
The datasets provided are as follows:
- **Key-Value Retrieval Datasets**
- `kv75`: Key-Value pairs with 75 keys.
- `kv140`: Key-Value pairs with 140 keys.
- `kv300`: Key-Value pairs with 300 keys.
- **Multi-Document Question Answering Datasets**
- `qa10`: Questions with answers derived from 10 documents.
- `qa20`: Questions with answers derived from 20 documents.
- `qa30`: Questions with answers derived from 30 documents.
## Loading the Data
You can load these datasets using the Hugging Face `datasets` library:
```python
from datasets import load_dataset
# Example for loading the kv75 dataset
dataset = load_dataset("bzantium/LITM", "kv75")

# Example for loading the qa20 dataset
dataset = load_dataset("bzantium/LITM", "qa20")
``` |
open-llm-leaderboard/details_psmathur__orca_mini_v3_13b | ---
pretty_name: Evaluation run of psmathur/orca_mini_v3_13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/orca_mini_v3_13b](https://huggingface.co/psmathur/orca_mini_v3_13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v3_13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T15:47:49.456107](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_13b/blob/main/results_2023-10-18T15-47-49.456107.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15383808724832215,\n\
\ \"em_stderr\": 0.0036948628598682874,\n \"f1\": 0.22225880872483197,\n\
\ \"f1_stderr\": 0.0037670501187578413,\n \"acc\": 0.44797935342421163,\n\
\ \"acc_stderr\": 0.010609253699619367\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.15383808724832215,\n \"em_stderr\": 0.0036948628598682874,\n\
\ \"f1\": 0.22225880872483197,\n \"f1_stderr\": 0.0037670501187578413\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \
\ \"acc_stderr\": 0.00929849923558785\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650884\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/orca_mini_v3_13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T15_47_49.456107
path:
- '**/details_harness|drop|3_2023-10-18T15-47-49.456107.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T15-47-49.456107.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T15_47_49.456107
path:
- '**/details_harness|gsm8k|5_2023-10-18T15-47-49.456107.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T15-47-49.456107.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:34:12.529590.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:34:12.529590.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T15_47_49.456107
path:
- '**/details_harness|winogrande|5_2023-10-18T15-47-49.456107.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T15-47-49.456107.parquet'
- config_name: results
data_files:
- split: 2023_08_09T21_34_12.529590
path:
- results_2023-08-09T21:34:12.529590.parquet
- split: 2023_10_18T15_47_49.456107
path:
- results_2023-10-18T15-47-49.456107.parquet
- split: latest
path:
- results_2023-10-18T15-47-49.456107.parquet
---
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v3_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_13b](https://huggingface.co/psmathur/orca_mini_v3_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
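For reference, the timestamped split names are a mechanical rewrite of the run timestamp. A minimal illustrative sketch (this helper is an assumption inferred from the split names listed in this card, not part of any leaderboard tooling):

```python
def timestamp_to_split(ts: str) -> str:
    # Split names replace '-' and ':' in the ISO run timestamp with '_',
    # keeping the fractional-seconds dot (assumption based on the split
    # names appearing in this card's config listing).
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-18T15:47:49.456107"))
# -> 2023_10_18T15_47_49.456107
```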
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T15:47:49.456107](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_13b/blob/main/results_2023-10-18T15-47-49.456107.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.15383808724832215,
"em_stderr": 0.0036948628598682874,
"f1": 0.22225880872483197,
"f1_stderr": 0.0037670501187578413,
"acc": 0.44797935342421163,
"acc_stderr": 0.010609253699619367
},
"harness|drop|3": {
"em": 0.15383808724832215,
"em_stderr": 0.0036948628598682874,
"f1": 0.22225880872483197,
"f1_stderr": 0.0037670501187578413
},
"harness|gsm8k|5": {
"acc": 0.13115996967399546,
"acc_stderr": 0.00929849923558785
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650884
}
}
```
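As a quick sanity check on these numbers, the top-level "all" accuracy appears to be the unweighted mean of the two per-task accuracies (an assumption inferred from the values above, not documented aggregation behavior):

```python
# Per-task accuracies reported above (drop reports em/f1, which are
# carried over verbatim rather than averaged into acc).
gsm8k_acc = 0.13115996967399546
winogrande_acc = 0.7647987371744278

mean_acc = (gsm8k_acc + winogrande_acc) / 2
# Matches the reported "all" acc of 0.44797935342421163 up to rounding.
print(mean_acc)
```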
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1 | ---
pretty_name: Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [davzoku/frankencria-llama2-11b-v1.3-m.1](https://huggingface.co/davzoku/frankencria-llama2-11b-v1.3-m.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T15:22:38.067991](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1/blob/main/results_2024-02-14T15-22-38.067991.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4805808032934614,\n\
\ \"acc_stderr\": 0.03425901458913262,\n \"acc_norm\": 0.4858393351991251,\n\
\ \"acc_norm_stderr\": 0.03502593471342456,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.46868611616690686,\n\
\ \"mc2_stderr\": 0.015784113350451722\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255795,\n\
\ \"acc_norm\": 0.5281569965870307,\n \"acc_norm_stderr\": 0.014588204105102202\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5941047600079665,\n\
\ \"acc_stderr\": 0.004900608529778612,\n \"acc_norm\": 0.77504481179048,\n\
\ \"acc_norm_stderr\": 0.004166994527570876\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112133,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112133\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790604,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790604\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5258064516129032,\n \"acc_stderr\": 0.02840609505765332,\n \"\
acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.02840609505765332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3842364532019704,\n \"acc_stderr\": 0.03422398565657551,\n \"\
acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.03422398565657551\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.032922966391551414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n\
\ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6752293577981652,\n \"acc_stderr\": 0.020077729109310327,\n \"\
acc_norm\": 0.6752293577981652,\n \"acc_norm_stderr\": 0.020077729109310327\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488419,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \
\ \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.029872577708891183,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.029872577708891183\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.523121387283237,\n \"acc_stderr\": 0.026890297881303118,\n\
\ \"acc_norm\": 0.523121387283237,\n \"acc_norm_stderr\": 0.026890297881303118\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2111731843575419,\n\
\ \"acc_stderr\": 0.013650276794312202,\n \"acc_norm\": 0.2111731843575419,\n\
\ \"acc_norm_stderr\": 0.013650276794312202\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n\
\ \"acc_stderr\": 0.02825666072336018,\n \"acc_norm\": 0.5498392282958199,\n\
\ \"acc_norm_stderr\": 0.02825666072336018\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115886,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n\
\ \"acc_stderr\": 0.01217730625278669,\n \"acc_norm\": 0.3494132985658409,\n\
\ \"acc_norm_stderr\": 0.01217730625278669\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47549019607843135,\n \"acc_stderr\": 0.02020351728026144,\n \
\ \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.02020351728026144\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49795918367346936,\n \"acc_stderr\": 0.0320089533497105,\n\
\ \"acc_norm\": 0.49795918367346936,\n \"acc_norm_stderr\": 0.0320089533497105\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.46868611616690686,\n\
\ \"mc2_stderr\": 0.015784113350451722\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.012675392786772733\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15011372251705837,\n \
\ \"acc_stderr\": 0.009838590860906968\n }\n}\n```"
repo_url: https://huggingface.co/davzoku/frankencria-llama2-11b-v1.3-m.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|arc:challenge|25_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|gsm8k|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hellaswag|10_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T15-22-38.067991.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- '**/details_harness|winogrande|5_2024-02-14T15-22-38.067991.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T15-22-38.067991.parquet'
- config_name: results
data_files:
- split: 2024_02_14T15_22_38.067991
path:
- results_2024-02-14T15-22-38.067991.parquet
- split: latest
path:
- results_2024-02-14T15-22-38.067991.parquet
---
# Dataset Card for Evaluation run of davzoku/frankencria-llama2-11b-v1.3-m.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davzoku/frankencria-llama2-11b-v1.3-m.1](https://huggingface.co/davzoku/frankencria-llama2-11b-v1.3-m.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-14T15:22:38.067991](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-11b-v1.3-m.1/blob/main/results_2024-02-14T15-22-38.067991.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.4805808032934614,
"acc_stderr": 0.03425901458913262,
"acc_norm": 0.4858393351991251,
"acc_norm_stderr": 0.03502593471342456,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.46868611616690686,
"mc2_stderr": 0.015784113350451722
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.5281569965870307,
"acc_norm_stderr": 0.014588204105102202
},
"harness|hellaswag|10": {
"acc": 0.5941047600079665,
"acc_stderr": 0.004900608529778612,
"acc_norm": 0.77504481179048,
"acc_norm_stderr": 0.004166994527570876
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112133,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112133
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790604,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790604
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.024962683564331803,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.024962683564331803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945273,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945273
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6752293577981652,
"acc_stderr": 0.020077729109310327,
"acc_norm": 0.6752293577981652,
"acc_norm_stderr": 0.020077729109310327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891183,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891183
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.523121387283237,
"acc_stderr": 0.026890297881303118,
"acc_norm": 0.523121387283237,
"acc_norm_stderr": 0.026890297881303118
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2111731843575419,
"acc_stderr": 0.013650276794312202,
"acc_norm": 0.2111731843575419,
"acc_norm_stderr": 0.013650276794312202
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.02825666072336018,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.02825666072336018
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115886,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3494132985658409,
"acc_stderr": 0.01217730625278669,
"acc_norm": 0.3494132985658409,
"acc_norm_stderr": 0.01217730625278669
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.02020351728026144,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.02020351728026144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49795918367346936,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.49795918367346936,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.46868611616690686,
"mc2_stderr": 0.015784113350451722
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.012675392786772733
},
"harness|gsm8k|5": {
"acc": 0.15011372251705837,
"acc_stderr": 0.009838590860906968
}
}
```
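As a quick sketch of working with this output (the `results` dict below is an illustrative subset of the JSON above, not the full payload), the per-task MMLU accuracies can be extracted and ranked:

```python
# Illustrative subset of the results JSON shown above.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.6796116504854369},
    "harness|hendrycksTest-virology|5": {"acc": 0.39759036144578314},
    "harness|winogrande|5": {"acc": 0.7158642462509865},
}

# Keep only the MMLU (hendrycksTest) tasks and sort by accuracy, best first.
mmlu = {name.split("|")[1]: metrics["acc"]
        for name, metrics in results.items()
        if name.startswith("harness|hendrycksTest")}
ranking = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
print(ranking[0][0])  # hendrycksTest-management
```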
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/andreana_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of andreana/アンドレアナ/安哲拉 (Arknights)
This is the dataset of andreana/アンドレアナ/安哲拉 (Arknights), containing 98 images and their tags.
The core tags of this character are `short_hair, goggles_on_head, blue_eyes, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 98 | 149.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andreana_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 98 | 128.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andreana_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 247 | 250.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andreana_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
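For the IMG+TXT packages, each image ships with a same-named `.txt` file holding its tags. A minimal sketch for walking those pairs after extracting one of the zips (the `.png` extension and flat directory layout are assumptions; adjust to the actual archive contents):

```python
from pathlib import Path

def iter_image_tag_pairs(dataset_dir):
    """Yield (image_path, tag_string) for every IMG+TXT pair in a directory."""
    for img in sorted(Path(dataset_dir).glob("*.png")):
        txt = img.with_suffix(".txt")
        if txt.exists():
            yield img, txt.read_text(encoding="utf-8").strip()
```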
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/andreana_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | 1girl, black_jacket, goggles, fur-trimmed_jacket, solo, long_sleeves, looking_at_viewer, open_jacket, mouth_mask, black_shirt, black_gloves, closed_mouth, mask_pull, fingerless_gloves, simple_background, upper_body, tentacles, white_background, collarbone, black_pants, smile, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | goggles | fur-trimmed_jacket | solo | long_sleeves | looking_at_viewer | open_jacket | mouth_mask | black_shirt | black_gloves | closed_mouth | mask_pull | fingerless_gloves | simple_background | upper_body | tentacles | white_background | collarbone | black_pants | smile | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------|:---------------------|:-------|:---------------|:--------------------|:--------------|:-------------|:--------------|:---------------|:---------------|:------------|:--------------------|:--------------------|:-------------|:------------|:-------------------|:-------------|:--------------|:--------|:----------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mdd | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- cc-by-3.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- dialogue-modeling
paperswithcode_id: mdd
pretty_name: Movie Dialog dataset (MDD)
dataset_info:
- config_name: task1_qa
features:
- name: dialogue_turns
sequence:
- name: speaker
dtype: int32
- name: utterance
dtype: string
splits:
- name: train
num_bytes: 8621120
num_examples: 96185
- name: test
num_bytes: 894590
num_examples: 9952
- name: validation
num_bytes: 892540
num_examples: 9968
download_size: 135614957
dataset_size: 10408250
- config_name: task2_recs
features:
- name: dialogue_turns
sequence:
- name: speaker
dtype: int32
- name: utterance
dtype: string
splits:
- name: train
num_bytes: 205936579
num_examples: 1000000
- name: test
num_bytes: 2064509
num_examples: 10000
- name: validation
num_bytes: 2057290
num_examples: 10000
download_size: 135614957
dataset_size: 210058378
- config_name: task3_qarecs
features:
- name: dialogue_turns
sequence:
- name: speaker
dtype: int32
- name: utterance
dtype: string
splits:
- name: train
num_bytes: 356789364
num_examples: 952125
- name: test
num_bytes: 1730291
num_examples: 4915
- name: validation
num_bytes: 1776506
num_examples: 5052
download_size: 135614957
dataset_size: 360296161
- config_name: task4_reddit
features:
- name: dialogue_turns
sequence:
- name: speaker
dtype: int32
- name: utterance
dtype: string
splits:
- name: train
num_bytes: 497864160
num_examples: 945198
- name: test
num_bytes: 5220295
num_examples: 10000
- name: validation
num_bytes: 5372702
num_examples: 10000
- name: cand_valid
num_bytes: 1521633
num_examples: 10000
- name: cand_test
num_bytes: 1567235
num_examples: 10000
download_size: 192209920
dataset_size: 511546025
config_names:
- task1_qa
- task2_recs
- task3_qarecs
- task4_reddit
---
# Dataset Card for MDD
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**[The bAbI project](https://research.fb.com/downloads/babi/)
- **Repository:**
- **Paper:** [arXiv Paper](https://arxiv.org/pdf/1511.06931.pdf)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The Movie Dialog dataset (MDD) is designed to measure how well models can perform at goal and non-goal oriented dialog centered around the topic of movies (question answering, recommendation and discussion), drawing on various movie data sources such as MovieLens and OMDb.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The data is in English, as written by users on the OMDb and MovieLens websites.
## Dataset Structure
### Data Instances
An instance from the `task3_qarecs` config's `train` split:
```
{'dialogue_turns': {'speaker': [0, 1, 0, 1, 0, 1], 'utterance': ["I really like Jaws, Bottle Rocket, Saving Private Ryan, Tommy Boy, The Muppet Movie, Face/Off, and Cool Hand Luke. I'm looking for a Documentary movie.", 'Beyond the Mat', 'Who is that directed by?', 'Barry W. Blaustein', 'I like Jon Fauer movies more. Do you know anything else?', 'Cinematographer Style']}}
```
An instance from the `task4_reddit` config's `cand_valid` split:
```
{'dialogue_turns': {'speaker': [0], 'utterance': ['MORTAL KOMBAT !']}}
```
### Data Fields
For all configurations:
- `dialogue_turns`: a dictionary feature containing:
- `speaker`: an integer with possible values including `0`, `1`, indicating which speaker wrote the utterance.
- `utterance`: a `string` feature containing the text utterance.
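Since each record stores parallel `speaker` and `utterance` sequences, a small helper can zip them into a readable transcript (a sketch; the role labels are assumptions, as the dataset only stores integer speaker ids):

```python
def render_transcript(example, roles=("Speaker 0", "Speaker 1")):
    """Turn a dialogue_turns record into one line per utterance."""
    turns = example["dialogue_turns"]
    return "\n".join(
        f"{roles[s]}: {u}" for s, u in zip(turns["speaker"], turns["utterance"])
    )

example = {"dialogue_turns": {"speaker": [0, 1],
                              "utterance": ["Who directed Jaws?", "Steven Spielberg"]}}
print(render_transcript(example))
# Speaker 0: Who directed Jaws?
# Speaker 1: Steven Spielberg
```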
### Data Splits
The splits and corresponding sizes are:
|config |train |test |validation|cand_valid|cand_test|
|:--|------:|----:|---------:|----:|----:|
|task1_qa|96185|9952|9968|-|-|
|task2_recs|1000000|10000|10000|-|-|
|task3_qarecs|952125|4915|5052|-|-|
|task4_reddit|945198|10000|10000|10000|10000|
The `cand_valid` and `cand_test` splits contain negative candidates for the `task4_reddit` configuration; the true positive is ranked against these candidates and hits@k (or another ranking metric) is reported. (See paper)
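As a sketch of that ranking protocol (the function name and higher-is-better score convention are assumptions): score the true response alongside the negative candidates and check whether the true one lands in the top k:

```python
def hits_at_k(true_score, candidate_scores, k):
    """Return 1.0 if the true response ranks in the top k, else 0.0.

    Higher score = better; ties are counted against the true response.
    """
    rank = 1 + sum(1 for s in candidate_scores if s >= true_score)
    return 1.0 if rank <= k else 0.0

# True response scored 0.8 against three negative candidates.
print(hits_at_k(0.8, [0.9, 0.3, 0.1], k=1))  # 0.0 -- one negative outranks it
print(hits_at_k(0.8, [0.9, 0.3, 0.1], k=2))  # 1.0
```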
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The construction of the tasks depended on some existing datasets:
1) MovieLens. The data was downloaded from: http://grouplens.org/datasets/movielens/20m/ on May 27th, 2015.
2) OMDB. The data was downloaded from: http://beforethecode.com/projects/omdb/download.aspx on May 28th, 2015.
3) For `task4_reddit`, the data is a processed subset (movie subreddit only) of the data available at:
https://www.reddit.com/r/datasets/comments/3bxlg7
#### Who are the source language producers?
Users on the MovieLens, OMDb, and Reddit websites, among others.
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Jesse Dodge and Andreea Gane and Xiang Zhang and Antoine Bordes and Sumit Chopra and Alexander Miller and Arthur Szlam and Jason Weston (at Facebook Research).
### Licensing Information
```
Creative Commons Attribution 3.0 License
```
### Citation Information
```
@misc{dodge2016evaluating,
title={Evaluating Prerequisite Qualities for Learning End-to-End Dialog Systems},
author={Jesse Dodge and Andreea Gane and Xiang Zhang and Antoine Bordes and Sumit Chopra and Alexander Miller and Arthur Szlam and Jason Weston},
year={2016},
eprint={1511.06931},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@gchhablani](https://github.com/gchhablani) for adding this dataset. |
milkshake721/scienceQA-17k | ---
license: apache-2.0
---
|
datatab/SerbianOscarDataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 374855299.3164062
num_examples: 3037283
- name: test
num_bytes: 46856989.550781436
num_examples: 379661
- name: valid
num_bytes: 46856866.13281237
num_examples: 379660
download_size: 328089963
dataset_size: 468569155.0
---
# Dataset Card for "SerbianOscarDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LRGB/voc_superpixels_edge_wt_coord_feat_10 | ---
task_categories:
- graph-ml
size_categories:
- 1M<n<10M
tags:
- lrgb
---
# `voc_superpixels_edge_wt_coord_feat_10`
### Dataset Summary
| Dataset | Domain | Task | Node Feat. (dim) | Edge Feat. (dim) | Perf. Metric |
|---|---|---|---|---|---|
| PascalVOC-SP| Computer Vision | Node Prediction | Pixel + Coord (14) | Edge Weight (1 or 2) | macro F1 |
| Dataset | # Graphs | # Nodes | μ Nodes | μ Deg. | # Edges | μ Edges | μ Short. Path | μ Diameter
|---|---:|---:|---:|:---:|---:|---:|---:|---:|
| PascalVOC-SP| 11,355 | 5,443,545 | 479.40 | 5.65 | 30,777,444 | 2,710.48 | 10.74±0.51 | 27.62±2.13 |
## Additional Information
### Dataset Curators
* Vijay Prakash Dwivedi ([vijaydwivedi75](https://github.com/vijaydwivedi75))
### Licensing Information
[Custom License](http://host.robots.ox.ac.uk/pascal/VOC/voc2011/index.html) for Pascal VOC 2011 (respecting Flickr terms of use)
### Citation Information
```
@article{dwivedi2022LRGB,
title={Long Range Graph Benchmark},
author={Dwivedi, Vijay Prakash and Rampášek, Ladislav and Galkin, Mikhail and Parviz, Ali and Wolf, Guy and Luu, Anh Tuan and Beaini, Dominique},
journal={arXiv:2206.08164},
year={2022}
}
``` |
islam23/News_articles | ---
license: mit
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 200676636
num_examples: 30000
download_size: 24840815
dataset_size: 200676636
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Phonecharger/news-programmatic-labeling | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Business
'1': Sci/Tech
'2': Sports
'3': World
splits:
- name: train
num_bytes: 407587.2
num_examples: 1632
- name: test
num_bytes: 101896.8
num_examples: 408
download_size: 347138
dataset_size: 509484.0
---
# Dataset Card for "news-programmatic-labeling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxm/nq_corpus_dpr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3284289693
num_examples: 5332023
- name: dev
num_bytes: 520583613
num_examples: 849508
download_size: 2568992962
dataset_size: 3804873306
---
# Dataset Card for "nq_corpus_dpr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgallouedec/prj_gia_dataset_metaworld_shelf_place_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the shelf-place-v2 environment, sampled from a policy for shelf-place-v2.
This dataset was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_shelf_place_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_shelf_place_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
|
CyberHarem/hatsuzuki_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hatsuzuki/初月/初月 (Kantai Collection)
This is the dataset of hatsuzuki/初月/初月 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `short_hair, brown_hair, headband, yellow_eyes, breasts, hairband, brown_eyes, black_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 518.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 310.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1160 | 661.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 467.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1160 | 915.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsuzuki_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hatsuzuki_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, black_neckerchief, black_skirt, clothes_writing, corset, hachimaki, hair_flaps, looking_at_viewer, serafuku, solo, pleated_skirt, black_bodysuit, black_sailor_collar, simple_background, white_background, black_headband, pantyhose, anchor_symbol, cowboy_shot, gloves |
| 1 | 5 |  |  |  |  |  | 1girl, black_bodysuit, black_gloves, black_sailor_collar, clothes_writing, hachimaki, hair_flaps, serafuku, solo, upper_body, black_headband, black_neckerchief, simple_background, white_background, closed_mouth, looking_at_viewer, short_sleeves, anchor_symbol, medium_breasts |
| 2 | 6 |  |  |  |  |  | 1girl, black_headband, black_sailor_collar, hachimaki, hair_flaps, neckerchief, serafuku, simple_background, solo, upper_body, white_background, anchor_symbol, black_bodysuit, clothes_writing, looking_at_viewer |
| 3 | 12 |  |  |  |  |  | 1girl, black_bodysuit, hair_horns, serafuku, simple_background, solo, white_background, hachimaki, sailor_collar, upper_body, black_neckerchief, black_headband, looking_at_viewer, sidelocks, closed_mouth, open_mouth, short_sleeves, smile |
| 4 | 7 |  |  |  |  |  | 1girl, blush, hair_flaps, simple_background, solo, collarbone, white_background, looking_at_viewer, underwear_only, closed_mouth, small_breasts, ahoge, black_bra, medium_breasts, navel, panties |
| 5 | 15 |  |  |  |  |  | playboy_bunny, rabbit_ears, 1girl, detached_collar, fake_animal_ears, hair_flaps, simple_background, solo, white_background, looking_at_viewer, medium_breasts, black_leotard, blush, strapless_leotard, pantyhose, wrist_cuffs, alternate_costume, necktie, gloves |
| 6 | 13 |  |  |  |  |  | enmaided, 1girl, blush, solo, white_apron, black_dress, frills, hair_flaps, maid_headdress, simple_background, maid_apron, looking_at_viewer, long_sleeves, white_background, black_gloves, closed_mouth, cowboy_shot, puffy_sleeves, short_sleeves |
| 7 | 10 |  |  |  |  |  | 1girl, hair_flaps, solo, competition_swimsuit, hachimaki, cowboy_shot, looking_at_viewer, medium_breasts, smile, blue_one-piece_swimsuit, blush, black_headband, clothes_writing, innertube, simple_background, collarbone, green_eyes, white_background |
| 8 | 7 |  |  |  |  |  | 1girl, hair_flaps, looking_at_viewer, solo, alternate_costume, obi, floral_print, green_eyes, blush, gloves, smile, yukata |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_neckerchief | black_skirt | clothes_writing | corset | hachimaki | hair_flaps | looking_at_viewer | serafuku | solo | pleated_skirt | black_bodysuit | black_sailor_collar | simple_background | white_background | black_headband | pantyhose | anchor_symbol | cowboy_shot | gloves | black_gloves | upper_body | closed_mouth | short_sleeves | medium_breasts | neckerchief | hair_horns | sailor_collar | sidelocks | open_mouth | smile | blush | collarbone | underwear_only | small_breasts | ahoge | black_bra | navel | panties | playboy_bunny | rabbit_ears | detached_collar | fake_animal_ears | black_leotard | strapless_leotard | wrist_cuffs | alternate_costume | necktie | enmaided | white_apron | black_dress | frills | maid_headdress | maid_apron | long_sleeves | puffy_sleeves | competition_swimsuit | blue_one-piece_swimsuit | innertube | green_eyes | obi | floral_print | yukata |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------|:------------------|:---------|:------------|:-------------|:--------------------|:-----------|:-------|:----------------|:-----------------|:----------------------|:--------------------|:-------------------|:-----------------|:------------|:----------------|:--------------|:---------|:---------------|:-------------|:---------------|:----------------|:-----------------|:--------------|:-------------|:----------------|:------------|:-------------|:--------|:--------|:-------------|:-----------------|:----------------|:--------|:------------|:--------|:----------|:----------------|:--------------|:------------------|:-------------------|:----------------|:--------------------|:--------------|:--------------------|:----------|:-----------|:--------------|:--------------|:---------|:-----------------|:-------------|:---------------|:----------------|:-----------------------|:--------------------------|:------------|:-------------|:------|:---------------|:---------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | | X | X | X | X | X | | X | X | X | X | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | X | | X | X | X | X | X | | X | X | X | X | X | | X | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | | | | X | | X | X | X | | X | | X | X | X | | | | | | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | | | | X | X | | X | | | | X | X | | | | | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 15 |  |  |  |  |  | X | | | | | | X | X | | X | | | | X | X | | X | | | X | | | | | X | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 6 | 13 |  |  |  |  |  | X | | | | | | X | X | | X | | | | X | X | | | | X | | X | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | | X | | X | X | X | | X | | | | X | X | X | | | X | | | | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | |
| 8 | 7 |  |  |  |  |  | X | | | | | | X | X | | X | | | | | | | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X |
|
Kamyar-zeinalipour/CW_TR_TEXT_V4 | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 11432212
num_examples: 8000
- name: test
num_bytes: 982031
num_examples: 690
download_size: 6552859
dataset_size: 12414243
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
zolak/twitter_dataset_1713015427 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24329963
num_examples: 60971
download_size: 12215683
dataset_size: 24329963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
idleheroevich2/fwog | ---
license: unknown
---
|
lewtun/cat-toy | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1322754.0
num_examples: 4
download_size: 1265258
dataset_size: 1322754.0
---
# Dataset Card for "cat-toy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HausaNLP/NaijaSenti-Twitter | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
task_ids:
- sentiment-analysis
- sentiment-classification
- sentiment-scoring
- semantic-similarity-classification
- semantic-similarity-scoring
tags:
- sentiment analysis, Twitter, tweets
- sentiment
multilinguality:
- monolingual
- multilingual
size_categories:
- 100K<n<1M
language:
- hau
- ibo
- pcm
- yor
pretty_name: NaijaSenti
---
<p align="center">
<img src="https://raw.githubusercontent.com/hausanlp/NaijaSenti/main/image/naijasenti_logo1.png" width="500">
</p>
--------------------------------------------------------------------------------
## Dataset Description
- **Homepage:** https://github.com/hausanlp/NaijaSenti
- **Repository:** [GitHub](https://github.com/hausanlp/NaijaSenti)
- **Paper:** [NaijaSenti: A Nigerian Twitter Sentiment Corpus for Multilingual Sentiment Analysis](https://aclanthology.org/2022.lrec-1.63/)
- **Leaderboard:** N/A
- **Point of Contact:** [Shamsuddeen Hassan Muhammad](shamsuddeen2004@gmail.com)
### Dataset Summary
NaijaSenti is the first large-scale human-annotated Twitter sentiment dataset for the four most widely spoken languages in Nigeria — Hausa, Igbo, Nigerian-Pidgin, and Yorùbá — consisting of around 30,000 annotated tweets per language, including a significant fraction of code-mixed tweets.
### Supported Tasks and Leaderboards
The NaijaSenti dataset can be used for a wide range of sentiment analysis tasks in Nigerian languages, such as sentiment classification, sentiment intensity analysis, and emotion detection. It is suitable for training and evaluating machine learning models for various NLP tasks related to sentiment analysis in African languages. It was among the datasets used for [SemEval 2023 Task 12: Sentiment Analysis for African Languages](https://codalab.lisn.upsaclay.fr/competitions/7320).
### Languages
4 most spoken Nigerian languages
* Hausa (hau)
* Igbo (ibo)
* Nigerian Pidgin (pcm)
* Yoruba (yor)
## Dataset Structure
### Data Instances
For each instance, there is a string for the tweet and a string for the label. See the NaijaSenti [dataset viewer](https://huggingface.co/datasets/HausaNLP/NaijaSenti-Twitter/viewer/hau/train) to explore more examples.
```
{
"tweet": "string",
"label": "string"
}
```
### Data Fields
The data fields are:
```
tweet: a string feature.
label: a classification label, with possible values including positive, negative and neutral.
```
### Data Splits
The NaijaSenti dataset has 3 splits: train, validation, and test. Below are the statistics for Version 1.0.0 of the dataset.
| | hau | ibo | pcm | yor |
|---|---|---|---|---|
| train | 14,172 | 10,192 | 5,121 | 8,522 |
| dev | 2,677 | 1,841 | 1,281 | 2,090 |
| test | 5,303 | 3,682 | 4,154 | 4,515 |
| total | 22,152 | 15,715 | 10,556 | 15,127 |
### How to use it
```python
from datasets import load_dataset
# You can load a specific language (e.g., Hausa). This downloads the train, validation and test sets.
ds = load_dataset("HausaNLP/NaijaSenti-Twitter", "hau")
# train set only
ds = load_dataset("HausaNLP/NaijaSenti-Twitter", "hau", split = "train")
# test set only
ds = load_dataset("HausaNLP/NaijaSenti-Twitter", "hau", split = "test")
# validation set only
ds = load_dataset("HausaNLP/NaijaSenti-Twitter", "hau", split = "validation")
```
## Dataset Creation
### Curation Rationale
NaijaSenti Version 1.0.0 is intended for sentiment analysis and related tasks in Nigerian indigenous and creole languages: Hausa, Igbo, Nigerian Pidgin and Yoruba.
### Source Data
Twitter
### Personal and Sensitive Information
We anonymized the tweets by replacing all *@mentions* by *@user* and removed all URLs.
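That anonymization step can be sketched with simple regular expressions. This is a rough approximation of the idea, not the curators' actual pipeline:

```python
import re

def anonymize(tweet: str) -> str:
    # Replace @mentions with @user, then strip URLs.
    tweet = re.sub(r"@\w+", "@user", tweet)
    tweet = re.sub(r"https?://\S+", "", tweet)
    return tweet.strip()
```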
## Considerations for Using the Data
### Social Impact of Dataset
The NaijaSenti dataset has the potential to improve sentiment analysis for Nigerian languages, which is essential for understanding and analyzing the diverse perspectives of people in Nigeria. This dataset can enable researchers and developers to create sentiment analysis models that are specific to Nigerian languages, which can be used to gain insights into the social, cultural, and political views of people in Nigeria. Furthermore, this dataset can help address the issue of underrepresentation of Nigerian languages in natural language processing, paving the way for more equitable and inclusive AI technologies.
## Additional Information
### Dataset Curators
* Shamsuddeen Hassan Muhammad
* Idris Abdulmumin
* Ibrahim Said Ahmad
* Bello Shehu Bello
### Licensing Information
This NaijaSenti is licensed under a Creative Commons Attribution BY-NC-SA 4.0 International License
### Citation Information
```
@inproceedings{muhammad-etal-2022-naijasenti,
title = "{N}aija{S}enti: A {N}igerian {T}witter Sentiment Corpus for Multilingual Sentiment Analysis",
author = "Muhammad, Shamsuddeen Hassan and
Adelani, David Ifeoluwa and
Ruder, Sebastian and
Ahmad, Ibrahim Sa{'}id and
Abdulmumin, Idris and
Bello, Bello Shehu and
Choudhury, Monojit and
Emezue, Chris Chinenye and
Abdullahi, Saheed Salahudeen and
Aremu, Anuoluwapo and
Jorge, Al{\'\i}pio and
Brazdil, Pavel",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.63",
pages = "590--602",
}
```
### Contributions
> This work was carried out with support from Lacuna Fund, an initiative co-founded by The Rockefeller Foundation, Google.org, and Canada’s International Development Research Centre. The views expressed herein do not necessarily represent those of Lacuna Fund, its Steering Committee, its funders, or Meridian Institute. |
Jayveersinh-Raj/hindi-abuse-detection-train | ---
language:
- hi
---
# Dataset details
Approximately 1,000 cleaned, labelled examples in the Hindi language.
# Labels
Binary: hate speech: 1, neutral: 0
|
junjiang/cew2B | ---
license: apache-2.0
---
|
Raivatv24/Date_jese | ---
noticia: null
license: apache-2.0
language:
- pt
pretty_name: Date_jese
size_categories:
- n<1K
--- |
ethux/belastingdienst-dataset | ---
license: apache-2.0
language:
- nl
size_categories:
- 1K<n<10K
---
# Dutch GOV Belastingdienst
This dataset was created by scraping https://www.belastingdienst.nl/. I used the sitemap to get all allowed URLs.
It is possible that some URLs are missing.
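The sitemap-based URL collection can be sketched like this. It is an illustration of the general approach, not the exact script used, and assumes a standard XML sitemap:

```python
import urllib.request
from xml.etree import ElementTree

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_sitemap(xml_bytes: bytes) -> list:
    """Extract all <loc> URLs from a standard XML sitemap."""
    root = ElementTree.fromstring(xml_bytes)
    return [loc.text for loc in root.iterfind(".//sm:loc", SITEMAP_NS)]

def sitemap_urls(sitemap_url: str) -> list:
    # Fetch the sitemap over HTTP and parse its URL entries.
    with urllib.request.urlopen(sitemap_url) as resp:
        return parse_sitemap(resp.read())
```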
The reason for creating this dataset is that I couldn't find any other existing dataset with this data.
So here is this dataset. Enjoy!
### Please note this dataset is not completely checked or cleaned; it is a work in progress for me. I went the easy route. |
Nekofox/ja-zh-twitter-translate | ---
license: mit
task_categories:
- translation
language:
- zh
- ja
size_categories:
- n<1K
---
Translated by @Nekofoxtweet (me)
Twitter source: @RindouMikoto |
fagenorn/cuco-dataset | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
pretty_name: CuCo Style
size_categories:
- n<1K
tags: []
task_categories:
- text-to-image
task_ids: []
--- |
sahithya20/tech | ---
license: unknown
---
|
hpprc/jawiki-slim | ---
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: text
dtype: string
- name: is_disambiguation_page
dtype: bool
- name: is_sexual_page
dtype: bool
- name: is_violent_page
dtype: bool
- name: url
dtype: string
splits:
- name: train
num_bytes: 3826599238
num_examples: 1399160
download_size: 2201709335
dataset_size: 3826599238
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license:
- cc-by-sa-3.0
- gfdl
language:
- ja
--- |
AdapterOcean/med_alpaca_standardized_cluster_28_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 17975351
num_examples: 26823
download_size: 8806330
dataset_size: 17975351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_28_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lhallee/PiNUI_2048 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: SeqA
dtype: string
- name: SeqB
dtype: string
- name: Label
dtype: int64
splits:
- name: train
num_bytes: 1476934338
num_examples: 1547918
- name: test
num_bytes: 1071710
num_examples: 1041
- name: valid
num_bytes: 2455973
num_examples: 3098
download_size: 1330969890
dataset_size: 1480462021
---
# Dataset Card for "PiNUI_2048"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kalinds/ims_20 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5758
num_examples: 20
download_size: 3644
dataset_size: 5758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mrpc_comparative_as_to | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 1924
num_examples: 6
- name: train
num_bytes: 6011
num_examples: 22
- name: validation
num_bytes: 366
num_examples: 1
download_size: 16892
dataset_size: 8301
---
# Dataset Card for "MULTI_VALUE_mrpc_comparative_as_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/vr_train_free_72 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 5001563766
num_examples: 9713
download_size: 889932755
dataset_size: 5001563766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
eugenkalosha/wikien | ---
license: apache-2.0
---
|
heath1989/sd_prepare | ---
license: apache-2.0
---
|
vitaliy-sharandin/ai-incidents | ---
dataset_info:
features:
- name: _id
dtype: string
- name: incident_id
dtype: int64
- name: date
dtype: timestamp[ns]
- name: reports
dtype: string
- name: Alleged deployer of AI system
dtype: string
- name: Alleged developer of AI system
dtype: string
- name: Alleged harmed or nearly harmed parties
dtype: string
- name: description
dtype: string
- name: title
dtype: string
- name: year
dtype: int64
- name: spacy_negative_outcomes
dtype: string
- name: keybert_negative_outcomes
dtype: string
- name: Cluster
dtype: string
splits:
- name: train
num_bytes: 271118
num_examples: 514
download_size: 165345
dataset_size: 271118
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ai-incidents"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gradjitta/opus-eng-to-fin | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: validation
num_bytes: 249219
num_examples: 2000
- name: train
num_bytes: 86453966
num_examples: 962383
download_size: 65607334
dataset_size: 86703185
---
# Dataset Card for "opus-eng-to-fin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ajay141/ds_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-ccdv__arxiv-summarization-section-002db0-47978145233 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ccdv/arxiv-summarization
eval_info:
task: summarization
model: adityashukzy/bart-base-new-finetuned-arxiv
metrics: []
dataset_name: ccdv/arxiv-summarization
dataset_config: section
dataset_split: validation
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: adityashukzy/bart-base-new-finetuned-arxiv
* Dataset: ccdv/arxiv-summarization
* Config: section
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@adityashukzy](https://huggingface.co/adityashukzy) for evaluating this model. |
presencesw/dataset_2000_decompese_question_4 | ---
dataset_info:
features:
- name: entities
sequence: 'null'
- name: triplets
list:
- name: question
dtype: string
- name: answer
dtype: string
- name: complex_question
dtype: string
splits:
- name: train
num_bytes: 70501
num_examples: 199
download_size: 25954
dataset_size: 70501
---
# Dataset Card for "dataset_2000_decompese_question_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_80_1713146894 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 169946
num_examples: 406
download_size: 93293
dataset_size: 169946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
psyche/bool_sentence | ---
annotations_creators:
- machine-generated
language:
- ko
language_creators:
- found
multilinguality:
- monolingual
pretty_name: psyche/bool_sentence
size_categories:
- 100K<n<1M
source_datasets:
- original
tags: []
task_categories:
- text-classification
task_ids: []
---
|Model| psyche/bool_sentence (10k) |
|:------:|:---:|
|klue/bert-base|0.9335|
licence: cc-by-sa-2.0-kr (original source: National Institute of Korean Language, Standard Korean Dictionary) |
gguichard/wsd_myriade_synth_data_multilabel_xlm | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: float64
splits:
- name: train
num_bytes: 57941762.581044406
num_examples: 96254
- name: test
num_bytes: 3050168.4189555966
num_examples: 5067
download_size: 16635731
dataset_size: 60991931.0
---
# Dataset Card for "wsd_myriade_synth_data_multilabel_xlm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vaibhav9401/testllama | ---
license: apache-2.0
---
|
FanChen0116/bus_few4_8x | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 109163
num_examples: 560
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 18363
dataset_size: 186681
---
# Dataset Card for "bus_few4_8x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlpso/m2m3_qualitative_analysis_ref_cmbert_io | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m2m3_qualitative_analysis_ref_cmbert_io
## Introduction
This dataset was used to perform **qualitative analysis** of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on the **nested NER task** using the joint-labelling [M2] and hierarchical [M3] approaches.
It contains Paris trade directory entries from the 19th century.
## Dataset parameters
* Approaches : M2 and M3
* Dataset type : ground-truth
* Tokenizer : [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner)
* Tagging format : IO
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned models :
* M2 : [nlpso/m2_joint_label_ref_cmbert_io](https://huggingface.co/nlpso/m2_joint_label_ref_cmbert_io)
* M3 : [nlpso/m3_hierarchical_ner_ref_cmbert_io](https://huggingface.co/nlpso/m3_hierarchical_ner_ref_cmbert_io)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professional reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
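To illustrate the IO tagging format used by this dataset, here is a small sketch on a **hypothetical** level-1 entry (not taken from the dataset): tokens carry `I-<label>` tags with no `B-` prefix, so consecutive identically-labelled tokens form one span.

```python
# Hypothetical directory entry tagged with the IO scheme (level-1 entity types).
tokens = ["Dupont", ",", "charcutier", ",", "rue", "du", "Bac", ",", "12"]
tags   = ["I-PER", "O", "I-ACT", "O", "I-SPAT", "I-SPAT", "I-SPAT", "I-SPAT", "I-SPAT"]

def spans_from_io(tokens, tags):
    """Group consecutive identically-tagged tokens into (label, text) spans.

    Note: under IO, two adjacent entities of the same label merge into one
    span -- a known limitation of the format compared to IOB.
    """
    spans, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag == "O":
            current = None
            continue
        label = tag.split("-", 1)[1]
        if current is not None and current[0] == label:
            current[1].append(tok)          # extend the running span
        else:
            current = [label, [tok]]        # open a new span
            spans.append(current)
    return [(lab, " ".join(toks)) for lab, toks in spans]

print(spans_from_io(tokens, tags))
```

The entry above decodes into three level-1 spans: a PER, an ACT, and a SPAT covering the whole address.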
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m2m3_qualitative_analysis_ref_cmbert_io")
```
|
akoukas/chatgpt-classification-article-level | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Generated
'1': Human
splits:
- name: train
num_bytes: 838416.0019646365
num_examples: 814
- name: test
num_bytes: 105059.49901768172
num_examples: 102
- name: validation
num_bytes: 105059.49901768172
num_examples: 102
download_size: 603800
dataset_size: 1048534.9999999999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
Lompat/colab | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713012859 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9784
num_examples: 24
download_size: 9260
dataset_size: 9784
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713012859"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/shinyou_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shinyou (Kantai Collection)
This is the dataset of shinyou (Kantai Collection), containing 25 images and their tags.
The core tags of this character are `bangs, blonde_hair, blue_eyes, long_hair, side_ponytail, blunt_bangs, hair_ornament, maid_headdress, hair_ribbon, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 19.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinyou_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 15.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinyou_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 56 | 28.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinyou_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 19.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinyou_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 56 | 34.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shinyou_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shinyou_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_apron, green_dress, enmaided, maid_apron, blush, long_sleeves, smile, cowboy_shot, holding, simple_background, frilled_apron, tray |
| 1 | 7 |  |  |  |  |  | 1girl, dougi, smile, solo, blush, upper_body, hakama_short_skirt, red_hakama |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_apron | green_dress | enmaided | maid_apron | blush | long_sleeves | smile | cowboy_shot | holding | simple_background | frilled_apron | tray | dougi | upper_body | hakama_short_skirt | red_hakama |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:--------------|:-----------|:-------------|:--------|:---------------|:--------|:--------------|:----------|:--------------------|:----------------|:-------|:--------|:-------------|:---------------------|:-------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | | | | | X | | X | | | | | | X | X | X | X |
|
EleutherAI/quirky_capitals_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: bob_label
dtype: bool
- name: alice_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 14121.790811339199
num_examples: 128
- name: validation
num_bytes: 31218.416
num_examples: 284
- name: test
num_bytes: 30617.808
num_examples: 278
download_size: 36714
dataset_size: 75958.0148113392
---
# Dataset Card for "quirky_capitals_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
3mrys/daset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_TheBloke__airoboros-13B-HF | ---
pretty_name: Evaluation run of TheBloke/airoboros-13B-HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/airoboros-13B-HF](https://huggingface.co/TheBloke/airoboros-13B-HF)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__airoboros-13B-HF\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T02:12:37.195873](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-13B-HF/blob/main/results_2023-10-23T02-12-37.195873.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.11115771812080537,\n\
\ \"em_stderr\": 0.00321900621779522,\n \"f1\": 0.18403838087248262,\n\
\ \"f1_stderr\": 0.003410322751505753,\n \"acc\": 0.416848524958218,\n\
\ \"acc_stderr\": 0.009523880516878821\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.11115771812080537,\n \"em_stderr\": 0.00321900621779522,\n\
\ \"f1\": 0.18403838087248262,\n \"f1_stderr\": 0.003410322751505753\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \
\ \"acc_stderr\": 0.007086462127954497\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803145\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/airoboros-13B-HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T02_12_37.195873
path:
- '**/details_harness|drop|3_2023-10-23T02-12-37.195873.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T02-12-37.195873.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T02_12_37.195873
path:
- '**/details_harness|gsm8k|5_2023-10-23T02-12-37.195873.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T02-12-37.195873.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:45.973556.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:45.973556.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T02_12_37.195873
path:
- '**/details_harness|winogrande|5_2023-10-23T02-12-37.195873.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T02-12-37.195873.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_05_45.973556
path:
- results_2023-07-19T19:05:45.973556.parquet
- split: 2023_10_23T02_12_37.195873
path:
- results_2023-10-23T02-12-37.195873.parquet
- split: latest
path:
- results_2023-10-23T02-12-37.195873.parquet
---
# Dataset Card for Evaluation run of TheBloke/airoboros-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/airoboros-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/airoboros-13B-HF](https://huggingface.co/TheBloke/airoboros-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__airoboros-13B-HF",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-23T02:12:37.195873](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-13B-HF/blob/main/results_2023-10-23T02-12-37.195873.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779522,
"f1": 0.18403838087248262,
"f1_stderr": 0.003410322751505753,
"acc": 0.416848524958218,
"acc_stderr": 0.009523880516878821
},
"harness|drop|3": {
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779522,
"f1": 0.18403838087248262,
"f1_stderr": 0.003410322751505753
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954497
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803145
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Malvinan/mt5_in_context_language_modeling | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: language
dtype: string
- name: image_list
sequence: string
- name: annotations
sequence: string
- name: input_token_ids
sequence:
sequence: int64
- name: output_token_ids
sequence:
sequence: int64
splits:
- name: train
num_bytes: 61633106122
num_examples: 4903557
- name: validation
num_bytes: 117470776
num_examples: 9173
download_size: 16879210
dataset_size: 61750576898
---
# Dataset Card for "mt5_in_context_language_modeling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eduagarcia/mc4-pt_dedup | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 488218826601
num_examples: 161689320
download_size: 52220169137
dataset_size: 488218826601
---
# MC4-PT (deduplicated)
MC4-PT is the Portuguese subset of [MC4](http://arxiv.org/abs/2010.11934).
MC4 is a colossal, cleaned, multilingual version of Common Crawl's web crawl corpus, based on the Common Crawl dataset: "https://commoncrawl.org".
This version is deduplicated using the MinHash algorithm with Locality-Sensitive Hashing, following the approach of Lee et al. (2022). The raw version is also available [here](https://huggingface.co/datasets/eduagarcia/mc4-pt).
## Data Collection and Processing
We used 5-grams and a signature of size 256, considering two documents to be near-duplicates if their Jaccard similarity exceeded 0.7.
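The MinHash procedure above can be sketched in pure Python. This is a minimal illustration only, not the actual pipeline (which follows Lee et al. (2022) and presumably uses LSH banding over many documents); the word-level shingling and XOR-mask hash family here are simplifying assumptions.

```python
import hashlib
import random

def shingles(text, n=5):
    # Word-level 5-grams (an assumption; the real pipeline may shingle differently).
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(max(1, len(words) - n + 1))}

def minhash_signature(doc_shingles, num_perm=256, seed=0):
    # One min-hash value per "permutation", approximated here by XOR masks
    # over a stable 64-bit hash of each shingle.
    rng = random.Random(seed)
    masks = [rng.getrandbits(64) for _ in range(num_perm)]
    hashes = [int.from_bytes(hashlib.blake2b(s.encode(), digest_size=8).digest(), "big")
              for s in doc_shingles]
    return [min(h ^ m for h in hashes) for m in masks]

def estimated_jaccard(sig_a, sig_b):
    # The fraction of matching signature slots estimates the Jaccard similarity.
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

doc_a = "the quick brown fox jumps over the lazy dog near the river bank today"
doc_b = "the quick brown fox jumps over the lazy dog near the river bank now"
sig_a = minhash_signature(shingles(doc_a))
sig_b = minhash_signature(shingles(doc_b))
# Two documents are flagged as near-duplicates when the estimate exceeds 0.7.
is_near_duplicate = estimated_jaccard(sig_a, sig_b) > 0.7
```

In the full pipeline, signatures are additionally split into LSH bands so that candidate pairs can be found without comparing every pair of documents.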
psroy/mini-platypus-reclor-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6790
num_examples: 7
download_size: 10701
dataset_size: 6790
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
williamwilmer/william10 | ---
license: openrail
---
|
distilled-from-one-sec-cv12/chunk_157 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 909092744
num_examples: 177142
download_size: 927526791
dataset_size: 909092744
---
# Dataset Card for "chunk_157"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arieg/cluster02_large_150 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '000140'
'1': 001259
'2': '004507'
'3': 005940
'4': '006443'
'5': 007483
'6': 007487
'7': 007872
'8': '011237'
'9': 012986
'10': '014541'
'11': '014576'
'12': '014661'
'13': 018037
'14': 018038
'15': '022477'
'16': '024367'
'17': 025668
'18': 028241
'19': 028266
'20': '030056'
'21': '032333'
'22': '032337'
'23': 032339
'24': '035543'
'25': 036999
'26': 039259
'27': 039658
'28': '040657'
'29': '042020'
'30': '042023'
'31': '042025'
'32': '042030'
'33': '042046'
'34': '042372'
'35': '043030'
'36': 043598
'37': '043761'
'38': 043965
'39': 044794
'40': 046839
'41': 047197
'42': 047835
'43': 049394
'44': 049478
'45': '051655'
'46': 051659
'47': '052120'
'48': '052122'
'49': '052123'
'50': '052125'
'51': '053154'
'52': '054153'
'53': 055826
'54': 055830
'55': 055831
'56': '057371'
'57': '057640'
'58': '057665'
'59': 057691
'60': 059678
'61': '060170'
'62': '061160'
'63': '061736'
'64': 061820
'65': 061821
'66': 062592
'67': '064364'
'68': 064629
'69': '066405'
'70': '067366'
'71': '067367'
'72': '070426'
'73': 072149
'74': 072788
'75': 073309
'76': '073467'
'77': 075428
'78': 075784
'79': 075862
'80': '076074'
'81': 076079
'82': 079593
'83': 080518
'84': 085966
'85': 086140
'86': 091443
'87': 094449
'88': 094628
'89': 095908
'90': 096168
'91': 096696
'92': 097374
'93': 099095
'94': '101111'
'95': '101112'
'96': '107432'
'97': '107567'
'98': '108012'
'99': '108529'
'100': '109445'
'101': '109449'
'102': '109450'
'103': '110263'
'104': '111392'
'105': '112197'
'106': '113018'
'107': '113360'
'108': '114036'
'109': '114041'
'110': '116239'
'111': '116735'
'112': '117170'
'113': '119592'
'114': '120196'
'115': '121273'
'116': '122077'
'117': '122082'
'118': '122201'
'119': '122247'
'120': '125190'
'121': '126017'
'122': '126300'
'123': '126411'
'124': '126718'
'125': '128469'
'126': '129887'
'127': '129972'
'128': '130129'
'129': '130709'
'130': '130711'
'131': '131624'
'132': '131787'
'133': '134643'
'134': '134934'
'135': '135028'
'136': '135043'
'137': '135336'
'138': '137898'
'139': '139330'
'140': '139804'
'141': '140421'
'142': '141903'
'143': '144171'
'144': '144551'
'145': '144935'
'146': '145749'
'147': '145780'
'148': '146639'
'149': '148303'
'150': '148518'
'151': '148608'
'152': '149623'
'153': '149953'
splits:
- name: train
num_bytes: 1228242292.7
num_examples: 23100
download_size: 1230798715
dataset_size: 1228242292.7
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adammoss/spcc-images | ---
dataset_info:
features:
- name: image
dtype: image
- name: class
dtype: int64
splits:
- name: train
num_bytes: 2364143.0
num_examples: 201
- name: test
num_bytes: 29272910.141
num_examples: 2001
download_size: 31726930
dataset_size: 31637053.141
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/fujimoto_rina_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fujimoto_rina/藤本里奈/후지모토리나 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of fujimoto_rina/藤本里奈/후지모토리나 (THE iDOLM@STER: Cinderella Girls), containing 150 images and their tags.
The core tags of this character are `blonde_hair, long_hair, earrings, breasts, grey_eyes, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 150 | 148.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimoto_rina_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 150 | 104.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimoto_rina_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 297 | 191.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimoto_rina_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 150 | 137.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimoto_rina_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 297 | 245.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimoto_rina_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fujimoto_rina_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, midriff, hairband, navel, smile, solo, belt, cleavage, looking_at_viewer, nail_polish, blush, open_mouth, heart, necklace, one_eye_closed, microphone, pink_nails, fingerless_gloves, fishnet_thighhighs, hair_bow, idol, miniskirt, stuffed_animal, bra, frilled_skirt, pink_skirt, zettai_ryouiki |
| 1 | 12 |  |  |  |  |  | 1girl, smile, solo, looking_at_viewer, open_mouth, cleavage, necklace, off_shoulder, bare_shoulders, nail_polish, sitting, bracelet, ground_vehicle, jacket, motorcycle, one_eye_closed, short_shorts |
| 2 | 5 |  |  |  |  |  | 1girl, ear_piercing, jewelry, simple_background, smile, solo, looking_at_viewer, white_background, :3, closed_mouth, jacket, shirt, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | midriff | hairband | navel | smile | solo | belt | cleavage | looking_at_viewer | nail_polish | blush | open_mouth | heart | necklace | one_eye_closed | microphone | pink_nails | fingerless_gloves | fishnet_thighhighs | hair_bow | idol | miniskirt | stuffed_animal | bra | frilled_skirt | pink_skirt | zettai_ryouiki | off_shoulder | bare_shoulders | sitting | bracelet | ground_vehicle | jacket | motorcycle | short_shorts | ear_piercing | jewelry | simple_background | white_background | :3 | closed_mouth | shirt | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-----------|:--------|:--------|:-------|:-------|:-----------|:--------------------|:--------------|:--------|:-------------|:--------|:-----------|:-----------------|:-------------|:-------------|:--------------------|:---------------------|:-----------|:-------|:------------|:-----------------|:------|:----------------|:-------------|:-----------------|:---------------|:-----------------|:----------|:-----------|:-----------------|:---------|:-------------|:---------------|:---------------|:----------|:--------------------|:-------------------|:-----|:---------------|:--------|:-----------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | | | X | X | | X | X | X | | X | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X |
|
EgilKarlsen/AA_DistilRoBERTa_Final | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318780.21618997
num_examples: 26057
- name: test
num_bytes: 26774087.073587257
num_examples: 8686
download_size: 147167865
dataset_size: 107092867.28977722
---
# Dataset Card for "AA_DistilRoBERTa_Final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zpn/clearance | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: clearance
size_categories:
- n<1K
source_datasets: []
tags:
- bio
- bio-chem
- molnet
- molecule-net
- biophysics
task_categories:
- other
task_ids: []
---
# Dataset Card for clearance
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://moleculenet.org/
- **Repository:** https://github.com/deepchem/deepchem/tree/master
- **Paper:** https://arxiv.org/abs/1703.00564
### Dataset Summary
`clearance` is a dataset included in [Chemberta-2 benchmarking](https://arxiv.org/pdf/2209.01712.pdf).
## Dataset Structure
### Data Fields
Each split contains
* `smiles`: the [SMILES](https://en.wikipedia.org/wiki/Simplified_molecular-input_line-entry_system) representation of a molecule
* `selfies`: the [SELFIES](https://github.com/aspuru-guzik-group/selfies) representation of a molecule
* `target`:
### Data Splits
The dataset is split into an 80/10/10 train/valid/test split using scaffold split.
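The 80/10/10 proportions can be illustrated with a small sketch. Note that the real dataset uses a *scaffold* split (molecules are grouped by their Bemis-Murcko scaffold, typically via RDKit, so structurally similar molecules never straddle splits); this pure-Python sketch only shows the split proportions, not the scaffold grouping.

```python
import math

def split_80_10_10(items):
    """Deterministically partition a list into 80/10/10 train/valid/test.

    This illustrates the proportions only; the actual clearance dataset
    groups molecules by scaffold before assigning them to splits.
    """
    n = len(items)
    n_train = math.floor(n * 0.8)
    n_valid = math.floor(n * 0.1)
    return (
        items[:n_train],
        items[n_train:n_train + n_valid],
        items[n_train + n_valid:],
    )

train, valid, test = split_80_10_10(list(range(100)))
print(len(train), len(valid), len(test))  # 80 10 10
```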
### Source Data
#### Initial Data Collection and Normalization
Data was originally generated by the Pande Group at Stanford
### Licensing Information
This dataset was originally released under an MIT license
### Citation Information
```
@misc{https://doi.org/10.48550/arxiv.1703.00564,
doi = {10.48550/ARXIV.1703.00564},
url = {https://arxiv.org/abs/1703.00564},
author = {Wu, Zhenqin and Ramsundar, Bharath and Feinberg, Evan N. and Gomes, Joseph and Geniesse, Caleb and Pappu, Aneesh S. and Leswing, Karl and Pande, Vijay},
keywords = {Machine Learning (cs.LG), Chemical Physics (physics.chem-ph), Machine Learning (stat.ML), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Physical sciences, FOS: Physical sciences},
title = {MoleculeNet: A Benchmark for Molecular Machine Learning},
publisher = {arXiv},
year = {2017},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
Thanks to [@zanussbaum](https://github.com/zanussbaum) for adding this dataset.
|
Villekom/Capybara-fi-oai-style | ---
dataset_info:
features:
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1624976
num_examples: 1018
- name: test
num_bytes: 105373
num_examples: 54
download_size: 1017553
dataset_size: 1730349
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Modified from https://huggingface.co/datasets/Finnish-NLP/Capybara-fi-deepl-translated-sft to chatml format. |
yunjaeys/Contextual_Response_Evaluation_for_ESL_and_ASD_Support | ---
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- asd
- autism
- esl
- english_second_language
- NLP
- second_language
- phi-2
- openassistant_reward
pretty_name: Contextual Response Evaluation for ESL and ASD Support💜💬🌐
---
# Dataset Card for "Contextual Response Evaluation for ESL and ASD Support💜💬🌐"
## Dataset Description 📖
### Dataset Summary 📝
Curated by Eric Soderquist, this dataset is a collection of English prompts and responses generated by the Phi-2 model, designed to evaluate and improve NLP models for supporting ESL (English as a Second Language) and ASD (Autism Spectrum Disorder) user bases. Each prompt is paired with multiple AI-generated responses and evaluated using a reward model to assess their relevance and quality.
### Supported Tasks and Leaderboards 🎯
- `text-generation`: This dataset is intended to train and refine language models for generating sensitive and context-aware responses.
- `language-modeling`: It can also be used for scoring the quality of language model responses to support ESL and ASD individuals.
### Languages 🗣
The dataset is monolingual and written in English.
## Dataset Structure 🏗
### Data Instances 📜
Each data instance contains a prompt, multiple AI-generated responses to that prompt, and scores reflecting the quality of each response.
### Data Fields 🏛
- `prompt`: a string containing the original English prompt.
- `responses`: an array of strings containing responses generated by the language model.
- `scores`: an array of floats representing the reward model's evaluation of each response.
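Given these parallel `responses` and `scores` arrays, the highest-rated response for a record can be selected with a simple arg-max. The record below is a hypothetical example (not drawn from the dataset itself):

```python
def best_response(record):
    """Return the highest-scoring response in a record.

    `record` is assumed to follow the fields above: a dict with
    "prompt" (str), "responses" (list[str]), and "scores" (list[float]).
    """
    idx = max(range(len(record["scores"])), key=lambda i: record["scores"][i])
    return record["responses"][idx]

# Hypothetical example record (illustrative only).
record = {
    "prompt": "Can you explain what 'break the ice' means?",
    "responses": [
        "It means to start a conversation in a friendly way.",
        "Ice is frozen water.",
    ],
    "scores": [0.92, 0.15],
}
print(best_response(record))  # prints the first, higher-scoring response
```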
### Data Splits 🔢
This dataset is not divided into traditional splits and consists of one complete set for evaluation purposes.
## Dataset Creation 🛠
### Curation Rationale 🤔
The dataset was curated with the goal of advancing NLP technologies to better serve ESL and ASD communities, offering a resource to evaluate and enhance the sensitivity of language models in understanding and generating responses that cater to the unique needs of these groups.
### Source Data 🗃
#### Initial Data Collection and Normalization
Data was generated using the Phi-2 model in response to carefully crafted prompts, aiming to cover a range of contexts and challenges faced by ESL and ASD individuals.
#### Annotations 🛑
The dataset includes scores from a reward model, providing an evaluation based on the model's perceived quality and appropriateness of the responses.
### Personal and Sensitive Information 🛑
Responses are generated and do not contain any real personal or sensitive information.
## Considerations for Using the Data ⚖️
### Social Impact of the Dataset 🌍
This dataset has the potential to impact the development of inclusive language models that are attuned to the nuances of communication required by ESL and ASD individuals.
### Discussion of Biases 🧐
As with any language model, biases present in the training data of the Phi-2 model may be reflected in the responses.
### Other Known Limitations 🚧
The reward model's scores are based on its own training data and may not cover the full scope of human evaluative diversity.
## Additional Information 📚
### Dataset Curator 👥
This dataset was curated by Eric Soderquist with the intent to foster developments in NLP that can adapt to and support the diverse linguistic and communicative needs of ESL and ASD communities.
### Licensing Information ©️
The dataset is made available under the MIT license.
### Citation Information 📢
If you use this dataset in your research, please cite it as follows:
```bibtex
@misc{contextual_response_evaluation,
author = {Soderquist, Eric},
title = {Contextual Response Evaluation for ESL and ASD Support},
year = {2024}
}
```
### Contributions 👏
Contributions to further develop and expand this dataset are welcome. |
lhallee/HumanPPI_fold | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: seqs
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 51590813
num_examples: 26319
- name: valid
num_bytes: 475534
num_examples: 234
- name: test
num_bytes: 343668
num_examples: 180
download_size: 43467545
dataset_size: 52410015
---
# Dataset Card for "HumanPPI_fold"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoungkim/B_TRAIN | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 22574274612.391
num_examples: 107927
download_size: 10945289804
dataset_size: 22574274612.391
---
# Dataset Card for "B_TRAIN"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-79000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 991798
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_IDEA-CCNL__Ziya2-13B-Base | ---
pretty_name: Evaluation run of IDEA-CCNL/Ziya2-13B-Base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [IDEA-CCNL/Ziya2-13B-Base](https://huggingface.co/IDEA-CCNL/Ziya2-13B-Base) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IDEA-CCNL__Ziya2-13B-Base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:47:57.407157](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya2-13B-Base/blob/main/results_2024-03-29T21-47-57.407157.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6130507830004436,\n\
\ \"acc_stderr\": 0.03280993275771312,\n \"acc_norm\": 0.6149291264443323,\n\
\ \"acc_norm_stderr\": 0.033472041364777376,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4273957149136316,\n\
\ \"mc2_stderr\": 0.014618325186055537\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5042662116040956,\n \"acc_stderr\": 0.014610858923956959,\n\
\ \"acc_norm\": 0.5401023890784983,\n \"acc_norm_stderr\": 0.01456431885692485\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5876319458275244,\n\
\ \"acc_stderr\": 0.004912547040132875,\n \"acc_norm\": 0.7889862577175861,\n\
\ \"acc_norm_stderr\": 0.004071942209838286\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253833,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253833\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764805,\n \"\
acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764805\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518722,\n \
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518722\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246571,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246571\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709592,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709592\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069723,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069723\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n\
\ \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n\
\ \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653342,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653342\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016636,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016636\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4273957149136316,\n\
\ \"mc2_stderr\": 0.014618325186055537\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259776\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.604245640636846,\n \
\ \"acc_stderr\": 0.013469823701048812\n }\n}\n```"
repo_url: https://huggingface.co/IDEA-CCNL/Ziya2-13B-Base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-47-57.407157.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-47-57.407157.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- '**/details_harness|winogrande|5_2024-03-29T21-47-57.407157.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-47-57.407157.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_47_57.407157
path:
- results_2024-03-29T21-47-57.407157.parquet
- split: latest
path:
- results_2024-03-29T21-47-57.407157.parquet
---
# Dataset Card for Evaluation run of IDEA-CCNL/Ziya2-13B-Base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [IDEA-CCNL/Ziya2-13B-Base](https://huggingface.co/IDEA-CCNL/Ziya2-13B-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IDEA-CCNL__Ziya2-13B-Base",
"harness_winogrande_5",
split="train")
```
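The timestamped split names visible in the configurations above appear to be derived from the run timestamp by replacing `-` and `:` with `_` (the `.` before the microseconds is kept). A minimal sketch of that mapping, assuming this naming convention holds:

```python
# Illustrative sketch (assumed convention, not official leaderboard tooling):
# derive the split name used in this repo from a run timestamp.
def split_name_from_timestamp(ts: str) -> str:
    # '-' and ':' are not allowed in split names, so they become '_'.
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-03-29T21:47:57.407157"))
# -> 2024_03_29T21_47_57.407157
```

This matches the split `2024_03_29T21_47_57.407157` listed in the configs above.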
## Latest results
These are the [latest results from run 2024-03-29T21:47:57.407157](https://huggingface.co/datasets/open-llm-leaderboard/details_IDEA-CCNL__Ziya2-13B-Base/blob/main/results_2024-03-29T21-47-57.407157.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task can be found in the "results" config and in the "latest" split of its own config):
```python
{
"all": {
"acc": 0.6130507830004436,
"acc_stderr": 0.03280993275771312,
"acc_norm": 0.6149291264443323,
"acc_norm_stderr": 0.033472041364777376,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4273957149136316,
"mc2_stderr": 0.014618325186055537
},
"harness|arc:challenge|25": {
"acc": 0.5042662116040956,
"acc_stderr": 0.014610858923956959,
"acc_norm": 0.5401023890784983,
"acc_norm_stderr": 0.01456431885692485
},
"harness|hellaswag|10": {
"acc": 0.5876319458275244,
"acc_stderr": 0.004912547040132875,
"acc_norm": 0.7889862577175861,
"acc_norm_stderr": 0.004071942209838286
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070435,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070435
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253833,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764805,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764805
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246571,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246571
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709592,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709592
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069723,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069723
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653342,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653342
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378459,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378459
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4273957149136316,
"mc2_stderr": 0.014618325186055537
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259776
},
"harness|gsm8k|5": {
"acc": 0.604245640636846,
"acc_stderr": 0.013469823701048812
}
}
```
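As a quick post-processing sketch (not part of the official leaderboard tooling, which computes its own aggregates such as the "all" entry above), the per-task `acc` values in this JSON can be macro-averaged once loaded, e.g. over a small excerpt:

```python
# Illustrative only: macro-average the "acc" metric over per-task entries,
# using a small excerpt of the results JSON shown above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5185185185185185},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.618421052631579},
    "harness|winogrande|5": {"acc": 0.7482241515390686},
}
accs = [task["acc"] for task in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))  # -> 0.6284
```

The same pattern applies to `acc_norm`, `mc1`, and `mc2` where those keys are present.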
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dischargesum/discharge_target | ---
dataset_info:
features:
- name: note_id
dtype: string
- name: hadm_id
dtype: int64
- name: discharge_instructions
dtype: string
- name: brief_hospital_course
dtype: string
- name: discharge_instructions_word_count
dtype: int64
- name: brief_hospital_course_word_count
dtype: int64
splits:
- name: train
num_bytes: 232796436
num_examples: 68785
- name: valid
num_bytes: 49727121
num_examples: 14719
- name: test
num_bytes: 49697372
num_examples: 14702
download_size: 185405577
dataset_size: 332220929
---
# Dataset Card for "discharge_target"
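From the `num_examples` values declared in the metadata above, the train/valid/test proportions can be sanity-checked (an illustrative sketch, not part of the dataset's own tooling):

```python
# Illustrative only: split proportions computed from the num_examples
# values declared in this card's metadata.
splits = {"train": 68785, "valid": 14719, "test": 14702}
total = sum(splits.values())
ratios = {name: round(n / total, 2) for name, n in splits.items()}
print(ratios)  # roughly a 70/15/15 split
```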
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MoritzLaurer/cap_sotu | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: string
- name: label_cap2
dtype: int64
- name: label_cap2_text
dtype: string
- name: label_cap4
dtype: int64
- name: year
dtype: int64
- name: president
dtype: string
- name: pres_party
dtype: int64
- name: id_original
dtype: int64
- name: text_original
dtype: string
- name: text_preceding
dtype: string
- name: text_following
dtype: string
- name: doc_id
dtype: int64
splits:
- name: train
num_bytes: 13205826
num_examples: 23040
download_size: 6809027
dataset_size: 13205826
---
# Dataset Card for "cap_sotu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tellarin-ai/ntx_llm_inst_japanese | ---
license: cc-by-sa-4.0
language:
- ja
task_categories:
- token-classification
---
# Dataset Card for NTX v1 in the Aya format - Japanese subset
This dataset is a conversion of the Japanese data from the original NTX into the Aya instruction format; it is released here under the CC-BY-SA 4.0 license.
## Dataset Details
For the original NTX dataset, the conversion to the Aya instructions format, or more details, please refer to the full dataset in instruction form (https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions) or to the paper below.
**NOTE:** Unfortunately, due to a conversion issue with numerical expressions, this version only includes the temporal-expressions part of NTX.
## Citation
If you utilize this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@misc{chen2023dataset,
title={Dataset and Baseline System for Multi-lingual Extraction and Normalization of Temporal and Numerical Expressions},
author={Sanxing Chen and Yongqiang Chen and Börje F. Karlsson},
year={2023},
eprint={2303.18103},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-base | ---
pretty_name: Evaluation run of deepseek-ai/deepseek-coder-6.7b-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [deepseek-ai/deepseek-coder-6.7b-base](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T21:41:57.054032](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-base/blob/main/results_2024-04-02T21-41-57.054032.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3834336760075987,\n\
\ \"acc_stderr\": 0.034332482050115895,\n \"acc_norm\": 0.38623691006245225,\n\
\ \"acc_norm_stderr\": 0.0350884024183875,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.40281312056107804,\n\
\ \"mc2_stderr\": 0.014575988959515906\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3361774744027304,\n \"acc_stderr\": 0.01380485502620576,\n\
\ \"acc_norm\": 0.3703071672354949,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4095797649870544,\n\
\ \"acc_stderr\": 0.004907512103128349,\n \"acc_norm\": 0.5345548695478988,\n\
\ \"acc_norm_stderr\": 0.004977851161904398\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.03894734487013318,\n\
\ \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.03894734487013318\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41132075471698115,\n \"acc_stderr\": 0.030285009259009798,\n\
\ \"acc_norm\": 0.41132075471698115,\n \"acc_norm_stderr\": 0.030285009259009798\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761923,\n\
\ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761923\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972602,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36451612903225805,\n\
\ \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.36451612903225805,\n\
\ \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3696969696969697,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.3696969696969697,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.40404040404040403,\n \"acc_stderr\": 0.03496130972056127,\n \"\
acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.03496130972056127\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.42487046632124353,\n \"acc_stderr\": 0.0356747133521254,\n\
\ \"acc_norm\": 0.42487046632124353,\n \"acc_norm_stderr\": 0.0356747133521254\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.02412112541694119,\n\
\ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150023,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150023\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3853211009174312,\n \"acc_stderr\": 0.020865850852794108,\n \"\
acc_norm\": 0.3853211009174312,\n \"acc_norm_stderr\": 0.020865850852794108\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3480392156862745,\n \"acc_stderr\": 0.03343311240488419,\n \"\
acc_norm\": 0.3480392156862745,\n \"acc_norm_stderr\": 0.03343311240488419\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3206751054852321,\n \"acc_stderr\": 0.030381931949990414,\n \
\ \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.030381931949990414\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.3811659192825112,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.043749285605997376,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.043749285605997376\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3425925925925926,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.3425925925925926,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.03881891213334382,\n\
\ \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.03881891213334382\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091265,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40229885057471265,\n\
\ \"acc_stderr\": 0.017535294529068955,\n \"acc_norm\": 0.40229885057471265,\n\
\ \"acc_norm_stderr\": 0.017535294529068955\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.026424816594009852,\n\
\ \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.026424816594009852\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.015166544550490272,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.015166544550490272\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.02807415894760066,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.02807415894760066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4405144694533762,\n\
\ \"acc_stderr\": 0.02819640057419743,\n \"acc_norm\": 0.4405144694533762,\n\
\ \"acc_norm_stderr\": 0.02819640057419743\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.288135593220339,\n\
\ \"acc_stderr\": 0.011567140661324561,\n \"acc_norm\": 0.288135593220339,\n\
\ \"acc_norm_stderr\": 0.011567140661324561\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.31699346405228757,\n \"acc_stderr\": 0.018824219512706217,\n \
\ \"acc_norm\": 0.31699346405228757,\n \"acc_norm_stderr\": 0.018824219512706217\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4527363184079602,\n\
\ \"acc_stderr\": 0.035197027175769155,\n \"acc_norm\": 0.4527363184079602,\n\
\ \"acc_norm_stderr\": 0.035197027175769155\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079024,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079024\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.03733756969066164,\n\
\ \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.03733756969066164\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.40281312056107804,\n\
\ \"mc2_stderr\": 0.014575988959515906\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5808997632202052,\n \"acc_stderr\": 0.01386732519221011\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17968157695223655,\n \
\ \"acc_stderr\": 0.010575119964242239\n }\n}\n```"
repo_url: https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|arc:challenge|25_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|gsm8k|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hellaswag|10_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-41-57.054032.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T21-41-57.054032.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- '**/details_harness|winogrande|5_2024-04-02T21-41-57.054032.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T21-41-57.054032.parquet'
- config_name: results
data_files:
- split: 2024_04_02T21_41_57.054032
path:
- results_2024-04-02T21-41-57.054032.parquet
- split: latest
path:
- results_2024-04-02T21-41-57.054032.parquet
---
# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-6.7b-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-coder-6.7b-base](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-base",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-02T21:41:57.054032](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-base/blob/main/results_2024-04-02T21-41-57.054032.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3834336760075987,
"acc_stderr": 0.034332482050115895,
"acc_norm": 0.38623691006245225,
"acc_norm_stderr": 0.0350884024183875,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.40281312056107804,
"mc2_stderr": 0.014575988959515906
},
"harness|arc:challenge|25": {
"acc": 0.3361774744027304,
"acc_stderr": 0.01380485502620576,
"acc_norm": 0.3703071672354949,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.4095797649870544,
"acc_stderr": 0.004907512103128349,
"acc_norm": 0.5345548695478988,
"acc_norm_stderr": 0.004977851161904398
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.03894734487013318,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.03894734487013318
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41132075471698115,
"acc_stderr": 0.030285009259009798,
"acc_norm": 0.41132075471698115,
"acc_norm_stderr": 0.030285009259009798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761923,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761923
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972602,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36451612903225805,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.36451612903225805,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3696969696969697,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.3696969696969697,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.40404040404040403,
"acc_stderr": 0.03496130972056127,
"acc_norm": 0.40404040404040403,
"acc_norm_stderr": 0.03496130972056127
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.42487046632124353,
"acc_stderr": 0.0356747133521254,
"acc_norm": 0.42487046632124353,
"acc_norm_stderr": 0.0356747133521254
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150023,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150023
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3853211009174312,
"acc_stderr": 0.020865850852794108,
"acc_norm": 0.3853211009174312,
"acc_norm_stderr": 0.020865850852794108
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3480392156862745,
"acc_stderr": 0.03343311240488419,
"acc_norm": 0.3480392156862745,
"acc_norm_stderr": 0.03343311240488419
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3206751054852321,
"acc_stderr": 0.030381931949990414,
"acc_norm": 0.3206751054852321,
"acc_norm_stderr": 0.030381931949990414
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.03881891213334382,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.03881891213334382
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.42718446601941745,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.42718446601941745,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091265,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40229885057471265,
"acc_stderr": 0.017535294529068955,
"acc_norm": 0.40229885057471265,
"acc_norm_stderr": 0.017535294529068955
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.026424816594009852,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.026424816594009852
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490272,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490272
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.02807415894760066,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.02807415894760066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4405144694533762,
"acc_stderr": 0.02819640057419743,
"acc_norm": 0.4405144694533762,
"acc_norm_stderr": 0.02819640057419743
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.288135593220339,
"acc_stderr": 0.011567140661324561,
"acc_norm": 0.288135593220339,
"acc_norm_stderr": 0.011567140661324561
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.31699346405228757,
"acc_stderr": 0.018824219512706217,
"acc_norm": 0.31699346405228757,
"acc_norm_stderr": 0.018824219512706217
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4527363184079602,
"acc_stderr": 0.035197027175769155,
"acc_norm": 0.4527363184079602,
"acc_norm_stderr": 0.035197027175769155
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079024,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.40281312056107804,
"mc2_stderr": 0.014575988959515906
},
"harness|winogrande|5": {
"acc": 0.5808997632202052,
"acc_stderr": 0.01386732519221011
},
"harness|gsm8k|5": {
"acc": 0.17968157695223655,
"acc_stderr": 0.010575119964242239
}
}
```
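The per-task entries above can be post-processed directly. As a minimal sketch (not part of the official leaderboard tooling, and using only a three-task sample copied from the JSON above), one way to macro-average the MMLU (`hendrycksTest`) accuracies is:

```python
# Minimal sketch: macro-average `acc` over the hendrycksTest-* tasks.
# `sample_results` holds three entries copied from the results JSON above;
# the full results dict has one such entry per evaluated task.
sample_results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.35526315789473684},
}

mmlu_accs = [
    v["acc"]
    for k, v in sample_results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} tasks: {mmlu_mean:.4f}")
# → MMLU macro-average over 3 tasks: 0.3618
```

Running the same loop over the full dict (e.g. after loading the "results" configuration) yields the aggregate numbers shown in the "all" block.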
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
freshpearYoon/train_free_33 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604561360
num_examples: 10000
download_size: 1209954767
dataset_size: 9604561360
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ingram_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ingram/イングラム/MAC-10 (Girls' Frontline)
This is the dataset of ingram/イングラム/MAC-10 (Girls' Frontline), containing 35 images and their tags.
The core tags of this character are `black_hair, long_hair, braid, breasts, green_eyes, twin_braids, bangs, hair_over_one_eye, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 35 | 43.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ingram_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 35 | 23.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ingram_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 81 | 51.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ingram_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 35 | 38.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ingram_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 81 | 75.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ingram_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ingram_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 35 |  |  |  |  |  | 1girl, solo, looking_at_viewer, scar, navel, black_gloves, black_shorts, midriff, short_shorts, bare_shoulders, elbow_gloves, simple_background, holding_gun, smile, fingerless_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | scar | navel | black_gloves | black_shorts | midriff | short_shorts | bare_shoulders | elbow_gloves | simple_background | holding_gun | smile | fingerless_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------|:--------|:---------------|:---------------|:----------|:---------------|:-----------------|:---------------|:--------------------|:--------------|:--------|:--------------------|
| 0 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Arindam0231/AD-Synthetic-raw | ---
dataset_info:
features:
- name: age
dtype: int64
- name: workclass
dtype: string
- name: fnlwgt
dtype: float64
- name: education
dtype: string
- name: education-num
dtype: float64
- name: marital-status
dtype: string
- name: occupation
dtype: string
- name: relationship
dtype: string
- name: race
dtype: string
- name: sex
dtype: string
- name: capital-gain
dtype: float64
- name: capital-loss
dtype: float64
- name: hours-per-week
dtype: float64
- name: native-country
dtype: string
- name: income
dtype: string
splits:
- name: train
num_bytes: 7482940
num_examples: 48842
download_size: 1364348
dataset_size: 7482940
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wahid028/Law_domain_synthetic_data | ---
license: mit
---
|
b-mc2/cli-commands-explained | ---
license: cc0-1.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- terminal
- CLI
- code
- NLP
- commandlinefu
- cheatsheets
pretty_name: cli-commands-explained
size_categories:
- 10K<n<100K
---
#### Overview
This dataset is a collection of **16,098** command line instructions sourced from [Commandlinefu](https://www.commandlinefu.com/commands/browse) and [Cheatsheets](https://github.com/cheat/cheatsheets/tree/master). It includes an array of commands, each with an id, title, description, date, URL to the source, author, votes, and a flag indicating whether the description is AI-generated. The descriptions are primarily authored by the original contributors; for entries where descriptions were absent, they have been generated using [NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B). Out of the total entries, **10,039** descriptions are originally human-written, while **6,059** have been generated by AI.
Format:
| Key | Description | Type |
|--------------|-----------|------------|
| **id** | ID provided by Commandlinefu, content from Cheatsheets has IDs incremented afterwards | int |
| **votes** | User votes for a command from Commandlinefu; Cheatsheets entries default to `0`. | int |
| **url** | URL to data source | str |
| **title** | Title provided by source | str |
| **description** | Description provided by author or AI generated by NeuralBeagle14-7B | str |
| **code** | The actual CLI/Terminal Code | str |
| **author** | Author credited with code creation | str |
| **date** | Date code was created (estimate) | str |
| **ai_generated_description** | Flag to indicate if description was human written or AI written | bool |
```
ai_generated_description
False 10039
True 6059
```
#### Cleansing and Augmentation
Cleansing and data augmentation have been performed on the combined Commandlinefu and Cheatsheets data. Some content from both sources has been removed due to formatting issues. For Cheatsheets, I attempted to attribute an author and date using the output of `git log --diff-filter=A --pretty="format:%ai,%an" --follow $file`
#### TODO
If you have any edits you'd like to see in a version 2 of this dataset, let me know.
Random sample:
```json
{
"id": 13,
"votes": 1219,
"url": "http://www.commandlinefu.com/commands/view/13/run-the-last-command-as-root",
"title": "Run the last command as root",
"description": "Useful when you forget to use sudo for a command. \"!!\" grabs the last run command.",
"code": "sudo !!",
"author": "root",
"date": "2009-01-26 10:26:48",
"ai_generated_description": false
},
{
"id": 71,
"votes": 846,
"url": "http://www.commandlinefu.com/commands/view/71/serve-current-directory-tree-at-httphostname8000",
"title": "Serve current directory tree at http://$HOSTNAME:8000/",
"description": "This Python command, using the module SimpleHTTPServer, creates a basic web server that serves the current directory and its contents over HTTP on port 8000. When executed, it allows anyone with access to the specified URL (in this case, http://$HOSTNAME:8000/) to view and download files from the current directory as if it were a simple website.",
"code": "python -m SimpleHTTPServer",
"author": "pixelbeat",
"date": "2009-02-05 11:57:43",
"ai_generated_description": true
},
```
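As a minimal sketch of working with this schema, the records above can be split on the `ai_generated_description` flag to separate human-written from AI-generated descriptions (the records here are trimmed copies of the sample rows, not real dataset access):

```python
# Minimal sketch: split records by the ai_generated_description flag,
# using trimmed copies of the two sample rows shown above.
records = [
    {"id": 13, "author": "root", "ai_generated_description": False},
    {"id": 71, "author": "pixelbeat", "ai_generated_description": True},
]

human_written = [r for r in records if not r["ai_generated_description"]]
ai_written = [r for r in records if r["ai_generated_description"]]

print(len(human_written), len(ai_written))  # 1 1
```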
#### Citing this work
```TeX
@misc{b-mc2_2024_cli-commands-explained,
title = {cli-commands-explained Dataset},
author = {b-mc2},
year = {2023},
url = {https://huggingface.co/datasets/b-mc2/cli-commands-explained},
note = {This dataset was created by modifying data from the following sources: commandlinefu.com, https://github.com/cheat/cheatsheets/tree/master},
}
``` |
GabrielTOP/Yuri | ---
license: openrail
---
|
tilyupo/mmlu | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: int64
- name: task
dtype: string
splits:
- name: train
num_bytes: 9253917279
num_examples: 5613759
- name: validation
num_bytes: 6938682
num_examples: 13957
download_size: 2703116086
dataset_size: 9260855961
---
# Dataset Card for "mmlu"
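As a hedged sketch of how a record with the fields declared above (`question`, `choices`, `answer`, `task`) might be used, the following formats one record into a multiple-choice prompt. The record itself is a made-up example, not taken from the dataset:

```python
# Hedged sketch: format one record of the schema above into a
# multiple-choice prompt. The example record is invented for illustration.
def format_mcq(record: dict) -> str:
    letters = "ABCD"
    lines = [record["question"]]
    lines += [f"{letters[i]}. {c}" for i, c in enumerate(record["choices"])]
    # `answer` is an int index into `choices`, per the dataset_info block.
    lines.append(f"Answer: {letters[record['answer']]}")
    return "\n".join(lines)

example = {
    "question": "What is 2 + 2?",
    "choices": ["3", "4", "5", "6"],
    "answer": 1,
    "task": "elementary_mathematics",
}
print(format_mcq(example))
```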
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
norkart/norkart-faq | ---
license: apache-2.0
task_categories:
- question-answering
language:
- 'no'
- nb
size_categories:
- n<1K
---
This dataset was aggregated from https://kunnskapsbase.e-torg.no/hc/no and encompasses the entire Frequently Asked Questions (FAQ) section hosted on that page. |
Nexdata/212_People_48000_Images_of_Multi_person_and_Multi_view_Tracking_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
212 People – 48,000 Images of Multi-person and Multi-view Tracking Data. The data includes males and females, and the age distribution is from children to the elderly. The data diversity includes different age groups, different shooting angles, different human body orientations and postures. For annotation, we adopted rectangular bounding box annotations on the human body. This dataset can be used for multiple object tracking and other tasks.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1191?source=Huggingface
## Data size
212 people, 11 cameras, 48,000 images
## Population distribution
the race distribution is Asian, the gender distribution is male and female, the age distribution is from children to the elderly
## Collecting environment
indoor scenes
## Data diversity
different ages, different cameras, different human body orientations and postures
## Device
surveillance cameras, the image resolution is 1,920*1,080
## Data format
the image data format is .jpg, the annotation file format is .json
## Annotation content
human body rectangular bounding boxes
## Accuracy
A rectangular bounding box of the human body is qualified when the deviation is not more than 5
# Licensing Information
Commercial License
|
liuyanchen1015/MULTI_VALUE_mnli_never_negator | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 70964
num_examples: 315
- name: dev_mismatched
num_bytes: 78283
num_examples: 374
- name: test_matched
num_bytes: 73993
num_examples: 317
- name: test_mismatched
num_bytes: 78307
num_examples: 356
- name: train
num_bytes: 2789509
num_examples: 13057
download_size: 1865624
dataset_size: 3091056
---
# Dataset Card for "MULTI_VALUE_mnli_never_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
itdenismaslyuk/recommendation-llm | ---
license: mit
---
|
medric49/dolly-rag-mix-500 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: res:airedefined/gpt2-dolly-rag
dtype: string
- name: res:airedefined/pythia-14m-dolly-rag
dtype: string
- name: res:gpt-4
dtype: string
splits:
- name: train
num_bytes: 1218152
num_examples: 498
download_size: 601680
dataset_size: 1218152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dolly-rag-rm-training-new"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_tokyotech-llm__Swallow-70b-instruct-hf | ---
pretty_name: Evaluation run of tokyotech-llm/Swallow-70b-instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tokyotech-llm/Swallow-70b-instruct-hf](https://huggingface.co/tokyotech-llm/Swallow-70b-instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tokyotech-llm__Swallow-70b-instruct-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T17:31:51.560670](https://huggingface.co/datasets/open-llm-leaderboard/details_tokyotech-llm__Swallow-70b-instruct-hf/blob/main/results_2023-12-30T17-31-51.560670.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.668542166197933,\n\
\ \"acc_stderr\": 0.031317102785635216,\n \"acc_norm\": 0.6737570900410808,\n\
\ \"acc_norm_stderr\": 0.03193599243400287,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.4799587332250507,\n\
\ \"mc2_stderr\": 0.014270049627097015\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251104,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283509\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6474805815574587,\n\
\ \"acc_stderr\": 0.004767782256040988,\n \"acc_norm\": 0.8514240191196972,\n\
\ \"acc_norm_stderr\": 0.0035494312479073657\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236785,\n\
\ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236785\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983127,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983127\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603617,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603617\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289708,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289708\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n \
\ \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246572,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02865749128507199,\n \
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02865749128507199\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8807339449541285,\n \"acc_stderr\": 0.01389572929258896,\n \"\
acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.01389572929258896\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8774509803921569,\n\
\ \"acc_stderr\": 0.023015389732458265,\n \"acc_norm\": 0.8774509803921569,\n\
\ \"acc_norm_stderr\": 0.023015389732458265\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.0225355263526927,\n\
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.0225355263526927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.029763779406874965,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.029763779406874965\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.013182222616720887,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.013182222616720887\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32513966480446926,\n\
\ \"acc_stderr\": 0.01566654278505355,\n \"acc_norm\": 0.32513966480446926,\n\
\ \"acc_norm_stderr\": 0.01566654278505355\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5319148936170213,\n \"acc_stderr\": 0.029766675075873873,\n \
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.029766675075873873\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5208604954367666,\n\
\ \"acc_stderr\": 0.012759117066518005,\n \"acc_norm\": 0.5208604954367666,\n\
\ \"acc_norm_stderr\": 0.012759117066518005\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7189542483660131,\n \"acc_stderr\": 0.018185218954318086,\n \
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.018185218954318086\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072776,\n \"acc_norm\": 0.7727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072776\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429093,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429093\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698296,\n \"mc2\": 0.4799587332250507,\n\
\ \"mc2_stderr\": 0.014270049627097015\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047992\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \
\ \"acc_stderr\": 0.013727093010429785\n }\n}\n```"
repo_url: https://huggingface.co/tokyotech-llm/Swallow-70b-instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|arc:challenge|25_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|gsm8k|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hellaswag|10_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-31-51.560670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T17-31-51.560670.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- '**/details_harness|winogrande|5_2023-12-30T17-31-51.560670.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T17-31-51.560670.parquet'
- config_name: results
data_files:
- split: 2023_12_30T17_31_51.560670
path:
- results_2023-12-30T17-31-51.560670.parquet
- split: latest
path:
- results_2023-12-30T17-31-51.560670.parquet
---
# Dataset Card for Evaluation run of tokyotech-llm/Swallow-70b-instruct-hf
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tokyotech-llm/Swallow-70b-instruct-hf](https://huggingface.co/tokyotech-llm/Swallow-70b-instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
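The config names listed in the YAML above follow a simple naming convention derived from the harness task identifiers used in the parquet file names. As a sketch (this mirrors the observed pattern in this card, not an official API), the mapping can be expressed as:

```python
def config_name(task_id: str) -> str:
    """Derive a dataset config name from a harness task id,
    e.g. 'harness|hendrycksTest-anatomy|5' -> 'harness_hendrycksTest_anatomy_5'.

    This mirrors the naming convention visible in the YAML config list
    above (an observed pattern, not a documented API).
    """
    # Replace every separator character used in task ids with an underscore.
    for sep in ("|", "-", ":"):
        task_id = task_id.replace(sep, "_")
    return task_id


print(config_name("harness|hendrycksTest-anatomy|5"))  # harness_hendrycksTest_anatomy_5
print(config_name("harness|truthfulqa:mc|0"))          # harness_truthfulqa_mc_0
```

This can be handy when iterating over many task ids programmatically instead of copying config names from the YAML by hand.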
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tokyotech-llm__Swallow-70b-instruct-hf",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-30T17:31:51.560670](https://huggingface.co/datasets/open-llm-leaderboard/details_tokyotech-llm__Swallow-70b-instruct-hf/blob/main/results_2023-12-30T17-31-51.560670.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.668542166197933,
"acc_stderr": 0.031317102785635216,
"acc_norm": 0.6737570900410808,
"acc_norm_stderr": 0.03193599243400287,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698296,
"mc2": 0.4799587332250507,
"mc2_stderr": 0.014270049627097015
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251104,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.013822047922283509
},
"harness|hellaswag|10": {
"acc": 0.6474805815574587,
"acc_stderr": 0.004767782256040988,
"acc_norm": 0.8514240191196972,
"acc_norm_stderr": 0.0035494312479073657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983127,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983127
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603617,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289708,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289708
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246572,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02865749128507199,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02865749128507199
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.01389572929258896,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.01389572929258896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8774509803921569,
"acc_stderr": 0.023015389732458265,
"acc_norm": 0.8774509803921569,
"acc_norm_stderr": 0.023015389732458265
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.0225355263526927,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.0225355263526927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.029763779406874965,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.029763779406874965
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.013182222616720887,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.013182222616720887
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32513966480446926,
"acc_stderr": 0.01566654278505355,
"acc_norm": 0.32513966480446926,
"acc_norm_stderr": 0.01566654278505355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.029766675075873873,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.029766675075873873
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5208604954367666,
"acc_stderr": 0.012759117066518005,
"acc_norm": 0.5208604954367666,
"acc_norm_stderr": 0.012759117066518005
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.018185218954318086,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.018185218954318086
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.04013964554072776,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.04013964554072776
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.027265992434429093,
"acc_norm": 0.92,
"acc_norm_stderr": 0.027265992434429093
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698296,
"mc2": 0.4799587332250507,
"mc2_stderr": 0.014270049627097015
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047992
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.013727093010429785
}
}
```
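Once loaded (for instance from the results JSON linked above), the nested structure shown can be flattened with plain Python to get a per-task accuracy table. The dict literal below is a small excerpt of the full results, just to illustrate the shape:

```python
# A small excerpt of the "Latest results" dict shown above.
results = {
    "all": {"acc": 0.668542166197933, "acc_norm": 0.6737570900410808},
    "harness|arc:challenge|25": {"acc": 0.6151877133105802, "acc_norm": 0.6621160409556314},
    "harness|hellaswag|10": {"acc": 0.6474805815574587, "acc_norm": 0.8514240191196972},
}

# Flatten to {task_name: acc}, skipping the aggregate "all" entry.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all"
}

print(per_task_acc["harness|hellaswag|10"])  # 0.6474805815574587
```

The same pattern works on the full dict, e.g. to sort tasks by accuracy or to compare `acc` against `acc_norm` across tasks.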
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zhangshuoming/exclude_switch_subset_exebench | ---
dataset_info:
features:
- name: train_real_simple_io
struct:
- name: asm
struct:
- name: code
sequence: string
- name: target
sequence: string
- name: fname
dtype: string
- name: func_def
dtype: string
- name: func_head
dtype: string
- name: func_head_types
dtype: string
- name: path
dtype: string
- name: real_deps
dtype: string
- name: real_exe_wrapper
dtype: string
- name: real_io_pairs
struct:
- name: dummy_funcs
sequence: 'null'
- name: dummy_funcs_seed
sequence: 'null'
- name: input
list:
- name: value
sequence: string
- name: var
sequence: string
- name: output
list:
- name: value
sequence: string
- name: var
sequence: string
- name: real_iospec
dtype: string
- name: ref
dtype: string
- name: signature
sequence: string
- name: synth_deps
dtype: string
- name: synth_exe_wrapper
dtype: string
- name: synth_io_pairs
struct:
- name: dummy_funcs
sequence: string
- name: dummy_funcs_seed
sequence: int64
- name: input
list:
- name: value
sequence: string
- name: var
sequence: string
- name: output
list:
- name: value
sequence: string
- name: var
sequence: string
- name: synth_iospec
dtype: string
splits:
- name: train
num_bytes: 244027105
num_examples: 41662
download_size: 55525871
dataset_size: 244027105
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
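The YAML above describes a deeply nested record shape: parallel `code`/`target` sequences under `asm`, and `var`/`value` pairs inside each element of `real_io_pairs` and `synth_io_pairs`. As a minimal sketch of traversing one such record, the snippet below uses a hand-built stand-in dict (illustrative values, not real data from the dataset); loading the real thing would go through `datasets.load_dataset("zhangshuoming/exclude_switch_subset_exebench")`.

```python
# Hand-built stand-in record mimicking the train_real_simple_io schema above;
# field names follow the YAML, but the values are invented for illustration.
record = {
    "train_real_simple_io": {
        "fname": "add_one",
        "func_def": "int add_one(int x) { return x + 1; }",
        "func_head": "int add_one(int x)",
        "asm": {
            "target": ["x86-64"],
            "code": ["add_one:\n  lea eax, [rdi+1]\n  ret"],
        },
        "real_io_pairs": {
            "input": [{"var": ["x"], "value": ["41"]}],
            "output": [{"var": ["ret"], "value": ["42"]}],
        },
    }
}

sample = record["train_real_simple_io"]

# asm.code and asm.target are parallel sequences: one assembly listing per target.
asm_by_target = dict(zip(sample["asm"]["target"], sample["asm"]["code"]))
print(asm_by_target["x86-64"])

# Each io pair holds parallel var/value string sequences for inputs and outputs.
for inp, out in zip(sample["real_io_pairs"]["input"],
                    sample["real_io_pairs"]["output"]):
    bindings = dict(zip(inp["var"], inp["value"]))
    results = dict(zip(out["var"], out["value"]))
    print(bindings, "->", results)
```

Note that all leaf values in the io pairs are strings per the schema (`sequence: string`), so numeric comparison against an executed function would require parsing them first.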
|