| datasetId | card |
|---|---|
yvillamil/stratio-doc-q-response | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 43028
num_examples: 3
download_size: 20560
dataset_size: 43028
---
# Dataset Card for "stratio-doc-q-response"
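From the `dataset_info` above (3 examples totaling 43,028 bytes), a quick back-of-the-envelope check of the average example size; this is purely illustrative arithmetic, not part of the card:

```python
# Figures taken from the card's dataset_info block above.
num_bytes = 43028
num_examples = 3

avg_bytes = num_bytes / num_examples
print(f"average example size: {avg_bytes:.0f} bytes")  # average example size: 14343 bytes
```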
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anilbhatt1/emlo2s5-sample-flagging-HF-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyzhu/squad_qa_wrong_num_v5_full | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7301105
num_examples: 5070
- name: validation
num_bytes: 346484
num_examples: 300
download_size: 1464054
dataset_size: 7647589
---
# Dataset Card for "squad_qa_wrong_num_v5_full"
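The feature list above follows the SQuAD layout, where `answers` is a sequence holding parallel `text` and `answer_start` lists. A minimal sketch of how the offsets relate to the context, using a hypothetical record (all values invented for illustration):

```python
# Hypothetical record mirroring the feature schema above (not real data).
example = {
    "id": "hypothetical-id-0001",
    "title": "Example_Title",
    "context": "The answer is forty-two.",
    "question": "What is the answer?",
    "answers": {"text": ["forty-two"], "answer_start": [14]},
}

# answer_start gives the character offset of the answer within context.
start = example["answers"]["answer_start"][0]
text = example["answers"]["text"][0]
assert example["context"][start:start + len(text)] == text
```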
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RealTimeData/News_August_2023 | ---
dataset_info:
features:
- name: authors
sequence: string
- name: date_download
dtype: string
- name: date_modify
dtype: string
- name: date_publish
dtype: string
- name: description
dtype: string
- name: filename
dtype: string
- name: image_url
dtype: string
- name: language
dtype: string
- name: localpath
dtype: string
- name: maintext
dtype: string
- name: source_domain
dtype: string
- name: title
dtype: string
- name: title_page
dtype: string
- name: title_rss
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 18194599
num_examples: 5059
download_size: 8541046
dataset_size: 18194599
license: cc
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for "News_August_2023"
This dataset was constructed on 1 Aug 2023 and contains news published between 10 May 2023 and 1 Aug 2023, collected from various sources.
All news articles in this dataset are in English.
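As a minimal sketch of working with the string-typed date fields, assuming timestamps formatted like `2023-07-30 17:45:00` (the card does not specify the exact format), one might filter articles by publication month:

```python
from datetime import datetime

# Hypothetical sample rows mirroring the card's feature schema (not real data).
articles = [
    {"title": "A", "date_publish": "2023-05-12 08:00:00", "language": "en"},
    {"title": "B", "date_publish": "2023-07-30 17:45:00", "language": "en"},
]

def published_in_july(row):
    # date_publish is stored as a plain string; parse it before comparing.
    ts = datetime.strptime(row["date_publish"], "%Y-%m-%d %H:%M:%S")
    return ts.month == 7

july = [a["title"] for a in articles if published_in_july(a)]
print(july)  # ['B']
```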
Created from `commoncrawl`. |
DigitalUmuganda/kinyarwanda-english-machine-translation-dataset | ---
pretty_name: parallel corpus
annotations_creators:
- expert-generated
language_creators:
- Digital Umuganda
language:
- en
- rw
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 40K<n<50K
---
# Kinyarwanda English Parallel Datasets for Machine translation
A parallel corpus of 48,000 Kinyarwanda-English sentence pairs for machine translation, built by curating ordinary Kinyarwanda sentences and translating them into English. |
open-llm-leaderboard/details_bn999__mistral-4.2B | ---
pretty_name: Evaluation run of bn999/mistral-4.2B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bn999/mistral-4.2B](https://huggingface.co/bn999/mistral-4.2B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bn999__mistral-4.2B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T15:25:59.524569](https://huggingface.co/datasets/open-llm-leaderboard/details_bn999__mistral-4.2B/blob/main/results_2024-02-09T15-25-59.524569.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.41637906591897644,\n\
\ \"acc_stderr\": 0.03447539919442628,\n \"acc_norm\": 0.4210183358263366,\n\
\ \"acc_norm_stderr\": 0.03526782026071357,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777305,\n \"mc2\": 0.44821803712567926,\n\
\ \"mc2_stderr\": 0.01462738255861119\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.371160409556314,\n \"acc_stderr\": 0.014117971901142813,\n\
\ \"acc_norm\": 0.4087030716723549,\n \"acc_norm_stderr\": 0.014365750345427008\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45797649870543716,\n\
\ \"acc_stderr\": 0.004972126523031947,\n \"acc_norm\": 0.615116510655248,\n\
\ \"acc_norm_stderr\": 0.004855733568540276\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.03056159042673183,\n\
\ \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.03056159042673183\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.03733626655383509,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.03733626655383509\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993179,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993179\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236784,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236784\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5032258064516129,\n\
\ \"acc_stderr\": 0.028443414226438316,\n \"acc_norm\": 0.5032258064516129,\n\
\ \"acc_norm_stderr\": 0.028443414226438316\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.03282649385304151,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.03282649385304151\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.03883565977956929,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.03883565977956929\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5647668393782384,\n \"acc_stderr\": 0.03578038165008586,\n\
\ \"acc_norm\": 0.5647668393782384,\n \"acc_norm_stderr\": 0.03578038165008586\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4307692307692308,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.544954128440367,\n\
\ \"acc_stderr\": 0.021350503090925167,\n \"acc_norm\": 0.544954128440367,\n\
\ \"acc_norm_stderr\": 0.021350503090925167\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.49019607843137253,\n \"acc_stderr\": 0.03508637358630573,\n \"\
acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.03508637358630573\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.569620253164557,\n \"acc_stderr\": 0.03223017195937598,\n \
\ \"acc_norm\": 0.569620253164557,\n \"acc_norm_stderr\": 0.03223017195937598\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4125560538116592,\n\
\ \"acc_stderr\": 0.03304062175449296,\n \"acc_norm\": 0.4125560538116592,\n\
\ \"acc_norm_stderr\": 0.03304062175449296\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.04865777570410769,\n\
\ \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.04865777570410769\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5427350427350427,\n\
\ \"acc_stderr\": 0.03263622596380688,\n \"acc_norm\": 0.5427350427350427,\n\
\ \"acc_norm_stderr\": 0.03263622596380688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.49936143039591313,\n\
\ \"acc_stderr\": 0.01787994891443169,\n \"acc_norm\": 0.49936143039591313,\n\
\ \"acc_norm_stderr\": 0.01787994891443169\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.026564178111422625,\n\
\ \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.026564178111422625\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017754,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017754\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.028555827516528777,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.028555827516528777\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4758842443729904,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.4758842443729904,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.02774431344337654,\n\
\ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.02774431344337654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30141843971631205,\n \"acc_stderr\": 0.027374128882631157,\n \
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.027374128882631157\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35071707953063885,\n\
\ \"acc_stderr\": 0.012187773370741518,\n \"acc_norm\": 0.35071707953063885,\n\
\ \"acc_norm_stderr\": 0.012187773370741518\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3492647058823529,\n \"acc_stderr\": 0.028959755196824852,\n\
\ \"acc_norm\": 0.3492647058823529,\n \"acc_norm_stderr\": 0.028959755196824852\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39215686274509803,\n \"acc_stderr\": 0.01975172650876262,\n \
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.01975172650876262\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n\
\ \"acc_stderr\": 0.03533389234739245,\n \"acc_norm\": 0.5174129353233831,\n\
\ \"acc_norm_stderr\": 0.03533389234739245\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.036459813773888065,\n\
\ \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.036459813773888065\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777305,\n \"mc2\": 0.44821803712567926,\n\
\ \"mc2_stderr\": 0.01462738255861119\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6377269139700079,\n \"acc_stderr\": 0.013508855476252515\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \
\ \"acc_stderr\": 0.008820485491442463\n }\n}\n```"
repo_url: https://huggingface.co/bn999/mistral-4.2B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|arc:challenge|25_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|gsm8k|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hellaswag|10_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T15-25-59.524569.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T15-25-59.524569.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- '**/details_harness|winogrande|5_2024-02-09T15-25-59.524569.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T15-25-59.524569.parquet'
- config_name: results
data_files:
- split: 2024_02_09T15_25_59.524569
path:
- results_2024-02-09T15-25-59.524569.parquet
- split: latest
path:
- results_2024-02-09T15-25-59.524569.parquet
---
# Dataset Card for Evaluation run of bn999/mistral-4.2B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bn999/mistral-4.2B](https://huggingface.co/bn999/mistral-4.2B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bn999__mistral-4.2B",
"harness_winogrande_5",
split="train")
```
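Each run's split is named after its timestamp, as described above. A minimal sketch of that naming, assuming the simple character substitution visible in this card's config section (this is an observation from the split names, not a documented API):

```python
# Sketch (assumption): split names in this repo appear to be the run
# timestamp with "-" and ":" replaced by "_".
timestamp = "2024-02-09T15:25:59.524569"
split_name = timestamp.replace("-", "_").replace(":", "_")
# → "2024_02_09T15_25_59.524569", matching the split names in the YAML header
```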
## Latest results
These are the [latest results from run 2024-02-09T15:25:59.524569](https://huggingface.co/datasets/open-llm-leaderboard/details_bn999__mistral-4.2B/blob/main/results_2024-02-09T15-25-59.524569.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.41637906591897644,
"acc_stderr": 0.03447539919442628,
"acc_norm": 0.4210183358263366,
"acc_norm_stderr": 0.03526782026071357,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777305,
"mc2": 0.44821803712567926,
"mc2_stderr": 0.01462738255861119
},
"harness|arc:challenge|25": {
"acc": 0.371160409556314,
"acc_stderr": 0.014117971901142813,
"acc_norm": 0.4087030716723549,
"acc_norm_stderr": 0.014365750345427008
},
"harness|hellaswag|10": {
"acc": 0.45797649870543716,
"acc_stderr": 0.004972126523031947,
"acc_norm": 0.615116510655248,
"acc_norm_stderr": 0.004855733568540276
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.03056159042673183,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.03056159042673183
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.03733626655383509,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.03733626655383509
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993179,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993179
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236784,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236784
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5032258064516129,
"acc_stderr": 0.028443414226438316,
"acc_norm": 0.5032258064516129,
"acc_norm_stderr": 0.028443414226438316
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.03282649385304151,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.03282649385304151
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.03883565977956929,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.03883565977956929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5647668393782384,
"acc_stderr": 0.03578038165008586,
"acc_norm": 0.5647668393782384,
"acc_norm_stderr": 0.03578038165008586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4307692307692308,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.4307692307692308,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.544954128440367,
"acc_stderr": 0.021350503090925167,
"acc_norm": 0.544954128440367,
"acc_norm_stderr": 0.021350503090925167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.03508637358630573,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.03508637358630573
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.569620253164557,
"acc_stderr": 0.03223017195937598,
"acc_norm": 0.569620253164557,
"acc_norm_stderr": 0.03223017195937598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4125560538116592,
"acc_stderr": 0.03304062175449296,
"acc_norm": 0.4125560538116592,
"acc_norm_stderr": 0.03304062175449296
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.5922330097087378,
"acc_stderr": 0.04865777570410769,
"acc_norm": 0.5922330097087378,
"acc_norm_stderr": 0.04865777570410769
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5427350427350427,
"acc_stderr": 0.03263622596380688,
"acc_norm": 0.5427350427350427,
"acc_norm_stderr": 0.03263622596380688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.49936143039591313,
"acc_stderr": 0.01787994891443169,
"acc_norm": 0.49936143039591313,
"acc_norm_stderr": 0.01787994891443169
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4190751445086705,
"acc_stderr": 0.026564178111422625,
"acc_norm": 0.4190751445086705,
"acc_norm_stderr": 0.026564178111422625
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017754,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017754
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.028555827516528777,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.028555827516528777
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4758842443729904,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.4758842443729904,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.027374128882631157,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.027374128882631157
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35071707953063885,
"acc_stderr": 0.012187773370741518,
"acc_norm": 0.35071707953063885,
"acc_norm_stderr": 0.012187773370741518
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3492647058823529,
"acc_stderr": 0.028959755196824852,
"acc_norm": 0.3492647058823529,
"acc_norm_stderr": 0.028959755196824852
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.01975172650876262,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.01975172650876262
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739245,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739245
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479637,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479637
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777305,
"mc2": 0.44821803712567926,
"mc2_stderr": 0.01462738255861119
},
"harness|winogrande|5": {
"acc": 0.6377269139700079,
"acc_stderr": 0.013508855476252515
},
"harness|gsm8k|5": {
"acc": 0.11599696739954511,
"acc_stderr": 0.008820485491442463
}
}
```
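For programmatic use, the per-task entries in the JSON above can be aggregated directly, e.g. to recompute an average MMLU accuracy. A minimal sketch, using a hypothetical subset of the tasks shown (the full dict would include all `hendrycksTest` entries):

```python
# Sketch: averaging per-task accuracies from the results JSON above.
# `results` is a hypothetical subset of the full dict shown in this card.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.3851851851851852},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5263157894736842},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(mmlu) / len(mmlu)
```

Note this unweighted mean over the subset differs from the leaderboard's aggregate, which averages over all 57 MMLU tasks.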
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
naorm/desktop-blip | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 12745112.0
num_examples: 51
download_size: 12428273
dataset_size: 12745112.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-33b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T05:59:09.159543](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0/blob/main/results_2023-10-22T05-59-09.159543.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.30432046979865773,\n\
\ \"em_stderr\": 0.004712049527083924,\n \"f1\": 0.37717596476510223,\n\
\ \"f1_stderr\": 0.00456045095000614,\n \"acc\": 0.4400130926002275,\n\
\ \"acc_stderr\": 0.009847939494812614\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.30432046979865773,\n \"em_stderr\": 0.004712049527083924,\n\
\ \"f1\": 0.37717596476510223,\n \"f1_stderr\": 0.00456045095000614\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09628506444275967,\n \
\ \"acc_stderr\": 0.008125264128215886\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409345\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|arc:challenge|25_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T10_51_22.664215
path:
- '**/details_harness|drop|3_2023-10-19T10-51-22.664215.parquet'
- split: 2023_10_21T18_09_50.123692
path:
- '**/details_harness|drop|3_2023-10-21T18-09-50.123692.parquet'
- split: 2023_10_22T05_59_09.159543
path:
- '**/details_harness|drop|3_2023-10-22T05-59-09.159543.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T05-59-09.159543.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T10_51_22.664215
path:
- '**/details_harness|gsm8k|5_2023-10-19T10-51-22.664215.parquet'
- split: 2023_10_21T18_09_50.123692
path:
- '**/details_harness|gsm8k|5_2023-10-21T18-09-50.123692.parquet'
- split: 2023_10_22T05_59_09.159543
path:
- '**/details_harness|gsm8k|5_2023-10-22T05-59-09.159543.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T05-59-09.159543.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hellaswag|10_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T16:13:19.014173.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-02T16:13:19.014173.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T10_51_22.664215
path:
- '**/details_harness|winogrande|5_2023-10-19T10-51-22.664215.parquet'
- split: 2023_10_21T18_09_50.123692
path:
- '**/details_harness|winogrande|5_2023-10-21T18-09-50.123692.parquet'
- split: 2023_10_22T05_59_09.159543
path:
- '**/details_harness|winogrande|5_2023-10-22T05-59-09.159543.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T05-59-09.159543.parquet'
- config_name: results
data_files:
- split: 2023_08_02T16_13_19.014173
path:
- results_2023-08-02T16:13:19.014173.parquet
- split: 2023_10_19T10_51_22.664215
path:
- results_2023-10-19T10-51-22.664215.parquet
- split: 2023_10_21T18_09_50.123692
path:
- results_2023-10-21T18-09-50.123692.parquet
- split: 2023_10_22T05_59_09.159543
path:
- results_2023-10-22T05-59-09.159543.parquet
- split: latest
path:
- results_2023-10-22T05-59-09.159543.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0",
"harness_winogrande_5",
split="train")
```
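The per-run splits follow the timestamp naming visible in the configs above (e.g. `2023_10_22T05_59_09.159543`). If you prefer to pick the most recent run yourself rather than rely on the `latest` alias, a minimal sketch (assuming only that naming convention; `newest_split` is a hypothetical helper, not part of any library) is:

```python
from datetime import datetime

def newest_split(split_names):
    """Return the most recent timestamp-named split.

    Split names look like '2023_10_22T05_59_09.159543':
    underscores stand in for the '-' and ':' of an ISO timestamp.
    """
    def parse(name):
        date_part, time_part = name.split("T")
        iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
        return datetime.fromisoformat(iso)
    return max(split_names, key=parse)

splits = [
    "2023_10_19T10_51_22.664215",
    "2023_10_21T18_09_50.123692",
    "2023_10_22T05_59_09.159543",
]
print(newest_split(splits))  # -> 2023_10_22T05_59_09.159543
```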
## Latest results
These are the [latest results from run 2023-10-22T05:59:09.159543](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0/blob/main/results_2023-10-22T05-59-09.159543.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.30432046979865773,
"em_stderr": 0.004712049527083924,
"f1": 0.37717596476510223,
"f1_stderr": 0.00456045095000614,
"acc": 0.4400130926002275,
"acc_stderr": 0.009847939494812614
},
"harness|drop|3": {
"em": 0.30432046979865773,
"em_stderr": 0.004712049527083924,
"f1": 0.37717596476510223,
"f1_stderr": 0.00456045095000614
},
"harness|gsm8k|5": {
"acc": 0.09628506444275967,
"acc_stderr": 0.008125264128215886
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409345
}
}
```
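As a sanity check on the numbers above, the `acc` in the `all` block appears to be the unweighted mean of the per-task accuracies (here gsm8k and winogrande); `task_acc` below simply copies the values from the results JSON:

```python
# acc values copied from the latest-results JSON above
task_acc = {
    "harness|gsm8k|5": 0.09628506444275967,
    "harness|winogrande|5": 0.7837411207576953,
}

# unweighted mean across tasks
all_acc = sum(task_acc.values()) / len(task_acc)
print(all_acc)  # matches the "all" acc above (~0.44001)
```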
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-eval-aslg_pc12-default-007f32-95707146449 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- aslg_pc12
eval_info:
task: translation
model: HamdanXI/t5_small_gloss_merged_dataset_random_0.1
metrics: ['comet', 'bertscore']
dataset_name: aslg_pc12
dataset_config: default
dataset_split: train
col_mapping:
source: gloss
target: text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Translation
* Model: HamdanXI/t5_small_gloss_merged_dataset_random_0.1
* Dataset: aslg_pc12
* Config: default
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@HamdanXI](https://huggingface.co/HamdanXI) for evaluating this model. |
simsim314/Hebrew_Noam30B_Tokenized | ---
license: mit
---
|
open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-2 | ---
pretty_name: Evaluation run of Josephgflowers/Tinyllama-1.5B-Cinder-Test-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/Tinyllama-1.5B-Cinder-Test-2](https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T21:22:09.465979](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-2/blob/main/results_2024-04-05T21-22-09.465979.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2705480629693628,\n\
\ \"acc_stderr\": 0.031211817677101388,\n \"acc_norm\": 0.2722219261351411,\n\
\ \"acc_norm_stderr\": 0.03204316052559818,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.01497482727975233,\n \"mc2\": 0.4055707932475926,\n\
\ \"mc2_stderr\": 0.014734925050237744\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32337883959044367,\n \"acc_stderr\": 0.01366942163001213,\n\
\ \"acc_norm\": 0.35494880546075086,\n \"acc_norm_stderr\": 0.013983036904094092\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.39693288189603665,\n\
\ \"acc_stderr\": 0.004882619484166598,\n \"acc_norm\": 0.5189205337582155,\n\
\ \"acc_norm_stderr\": 0.004986207581862946\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.02783491252754408,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.02783491252754408\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.02895734278834235,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.02895734278834235\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700307,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700307\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n\
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02865749128507197,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02865749128507197\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603854,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603854\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24403669724770644,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814565,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814565\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2320675105485232,\n \"acc_stderr\": 0.027479744550808507,\n \
\ \"acc_norm\": 0.2320675105485232,\n \"acc_norm_stderr\": 0.027479744550808507\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847835,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847835\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004243,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004243\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\
\ \"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.29118773946360155,\n\
\ \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3300653594771242,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.3300653594771242,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.025583062489984838,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.025583062489984838\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495022,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495022\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02525786135943241,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02525786135943241\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n\
\ \"acc_stderr\": 0.010792595553888496,\n \"acc_norm\": 0.23272490221642764,\n\
\ \"acc_norm_stderr\": 0.010792595553888496\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21895424836601307,\n \"acc_stderr\": 0.01672993756553753,\n \
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.01672993756553753\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\
\ \"acc_stderr\": 0.036942843353378,\n \"acc_norm\": 0.18181818181818182,\n\
\ \"acc_norm_stderr\": 0.036942843353378\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3836734693877551,\n \"acc_stderr\": 0.03113088039623592,\n\
\ \"acc_norm\": 0.3836734693877551,\n \"acc_norm_stderr\": 0.03113088039623592\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.01497482727975233,\n \"mc2\": 0.4055707932475926,\n\
\ \"mc2_stderr\": 0.014734925050237744\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.595895816890292,\n \"acc_stderr\": 0.01379161066467085\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-56.896724.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-22-09.465979.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-22-09.465979.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- '**/details_harness|winogrande|5_2024-04-05T21-15-56.896724.parquet'
- split: 2024_04_05T21_22_09.465979
path:
- '**/details_harness|winogrande|5_2024-04-05T21-22-09.465979.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T21-22-09.465979.parquet'
- config_name: results
data_files:
- split: 2024_04_05T21_15_56.896724
path:
- results_2024-04-05T21-15-56.896724.parquet
- split: 2024_04_05T21_22_09.465979
path:
- results_2024-04-05T21-22-09.465979.parquet
- split: latest
path:
- results_2024-04-05T21-22-09.465979.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.5B-Cinder-Test-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.5B-Cinder-Test-2](https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-2",
"harness_winogrande_5",
split="train")
```
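The per-run splits are named with zero-padded timestamps (e.g. `2024_04_05T21_22_09.465979`), so the most recent run can be recovered with a plain lexicographic comparison; a minimal offline sketch:

```python
# Split names use a fixed-width, zero-padded timestamp format, so lexicographic
# order matches chronological order and max() picks the most recent run.
splits = ["2024_04_05T21_15_56.896724", "2024_04_05T21_22_09.465979"]
latest = max(splits)
print(latest)  # -> 2024_04_05T21_22_09.465979
```

The "latest" split in each configuration simply mirrors this most recent run.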
## Latest results
These are the [latest results from run 2024-04-05T21:22:09.465979](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-2/blob/main/results_2024-04-05T21-22-09.465979.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2705480629693628,
"acc_stderr": 0.031211817677101388,
"acc_norm": 0.2722219261351411,
"acc_norm_stderr": 0.03204316052559818,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.01497482727975233,
"mc2": 0.4055707932475926,
"mc2_stderr": 0.014734925050237744
},
"harness|arc:challenge|25": {
"acc": 0.32337883959044367,
"acc_stderr": 0.01366942163001213,
"acc_norm": 0.35494880546075086,
"acc_norm_stderr": 0.013983036904094092
},
"harness|hellaswag|10": {
"acc": 0.39693288189603665,
"acc_stderr": 0.004882619484166598,
"acc_norm": 0.5189205337582155,
"acc_norm_stderr": 0.004986207581862946
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.02783491252754408,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.02783491252754408
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080342,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080342
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.02895734278834235,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.02895734278834235
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700307,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700307
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.02865749128507197,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.02865749128507197
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603854,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603854
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814565,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814565
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2320675105485232,
"acc_stderr": 0.027479744550808507,
"acc_norm": 0.2320675105485232,
"acc_norm_stderr": 0.027479744550808507
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847835,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847835
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04391326286724071,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04391326286724071
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004243,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004243
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29118773946360155,
"acc_stderr": 0.016246087069701407,
"acc_norm": 0.29118773946360155,
"acc_norm_stderr": 0.016246087069701407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3300653594771242,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.3300653594771242,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.025583062489984838,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.025583062489984838
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495022,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495022
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02525786135943241,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02525786135943241
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23272490221642764,
"acc_stderr": 0.010792595553888496,
"acc_norm": 0.23272490221642764,
"acc_norm_stderr": 0.010792595553888496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.01672993756553753,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.01672993756553753
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3836734693877551,
"acc_stderr": 0.03113088039623592,
"acc_norm": 0.3836734693877551,
"acc_norm_stderr": 0.03113088039623592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.01497482727975233,
"mc2": 0.4055707932475926,
"mc2_stderr": 0.014734925050237744
},
"harness|winogrande|5": {
"acc": 0.595895816890292,
"acc_stderr": 0.01379161066467085
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
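For quick offline inspection, the JSON above can be treated as a plain Python dict; a hedged sketch (using three of the per-task entries copied from the results above) that macro-averages the raw accuracies:

```python
# Subset of the per-task results shown above; each value dict carries "acc"
# among other metrics, so a macro-average is a one-line reduction.
results = {
    "harness|arc:challenge|25": {"acc": 0.32337883959044367},
    "harness|hellaswag|10": {"acc": 0.39693288189603665},
    "harness|winogrande|5": {"acc": 0.595895816890292},
}
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(f"{mean_acc:.4f}")  # -> 0.4387
```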
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_qqp_benefactive_dative | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 563127
num_examples: 3350
- name: test
num_bytes: 6098962
num_examples: 35548
- name: train
num_bytes: 5134332
num_examples: 30404
download_size: 7027073
dataset_size: 11796421
---
# Dataset Card for "MULTI_VALUE_qqp_benefactive_dative"
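The YAML above declares the schema; as a rough sketch (the question texts below are invented, and the field meanings are inferred from the QQP task: `label` is presumably 1 for a duplicate pair, and `value_score` presumably marks the MULTI-VALUE dialect transformation), one row might look like:

```python
# Hypothetical row matching the declared schema. Field names come from the
# YAML above; the values are invented for illustration only.
row = {
    "question1": "How do I learn Python quickly?",
    "question2": "What is the fastest way to learn Python?",
    "label": 1,        # assumed: 1 = duplicate pair, as in QQP
    "idx": 0,
    "value_score": 1,  # assumed: MULTI-VALUE transformation score
}
print(sorted(row))
```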
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Semionn/annotated_youtube_mi_dataset | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1369317
num_examples: 133
download_size: 462178
dataset_size: 1369317
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/dagr_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dagr (Fire Emblem)
This is the dataset of dagr (Fire Emblem), containing 16 images and their tags.
The core tags of this character are `breasts, short_hair, blue_hair, grey_eyes, muscular_female, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 27.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagr_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 14.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagr_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 28.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagr_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 23.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagr_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 41.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dagr_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dagr_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, navel, abs, holding, smile, midriff, muscular, gloves, simple_background, cleavage, full_body, looking_at_viewer, sandals, weapon, white_background, jewelry, bird, teeth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | abs | holding | smile | midriff | muscular | gloves | simple_background | cleavage | full_body | looking_at_viewer | sandals | weapon | white_background | jewelry | bird | teeth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:------|:----------|:--------|:----------|:-----------|:---------|:--------------------|:-----------|:------------|:--------------------|:----------|:---------|:-------------------|:----------|:-------|:--------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Asap7772/persona_gpt4_paired_margin1 | ---
dataset_info:
features:
- name: x
dtype: string
- name: yw
dtype: string
- name: yl
dtype: string
- name: scorew
dtype: int64
- name: scorel
dtype: int64
- name: genw
dtype: string
- name: genl
dtype: string
- name: scorer
dtype: string
- name: scorer_id
dtype: int64
- name: scorerw_id
dtype: int64
- name: scorerl_id
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2332334054
num_examples: 718448
- name: test
num_bytes: 1370201
num_examples: 404
download_size: 53518425
dataset_size: 2333704255
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
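No description is provided; as a hedged sketch of the paired-preference schema declared above (field meanings inferred from the names: `x` is the prompt, `yw`/`yl` the preferred and dispreferred responses, `scorew`/`scorel` their scores, presumably at least 1 apart given the `margin1` suffix), one row might look like:

```python
# Hypothetical row; all values are invented, field names are taken from the YAML.
row = {
    "x": "Describe your ideal weekend.",
    "yw": "I'd start with a long hike, then cook dinner with friends.",
    "yl": "Weekend.",
    "scorew": 8,
    "scorel": 6,
}
margin = row["scorew"] - row["scorel"]
print(margin >= 1)  # -> True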
|
DeveloperOats/DBPedia_Classes | ---
annotations_creators: []
language:
- en
language_creators: []
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: 'DBpedia'
size_categories:
- 1M<n<10M
source_datasets: []
tags: []
task_categories:
- text-classification
task_ids:
- topic-classification
---
## About Dataset
DBpedia (from "DB" for "database") is a project aiming to extract structured content from the information created in Wikipedia.
This is an extract of the data (after cleaning, kernel included) that provides taxonomic, hierarchical categories ("classes") for 342,782 Wikipedia articles. There are 3 levels, with 9, 70 and 219 classes respectively.
A version of this dataset is a popular baseline for NLP/text classification tasks. This version of the dataset is much tougher, especially if the L2/L3 levels are used as the targets.
This is an excellent benchmark for hierarchical multiclass/multilabel text classification.
Some example approaches are included as code snippets.
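One simple evaluation, sketched under the assumption that each article carries one gold class per level (the class names below are real DBpedia ontology classes, but the gold/predicted rows are invented), is to score predictions level by level:

```python
# Per-level accuracy for 3-level hierarchical labels (L1/L2/L3).
# Gold/pred rows are invented; the class names are from the DBpedia ontology.
gold = [("Agent", "Athlete", "Cyclist"), ("Place", "Settlement", "Town")]
pred = [("Agent", "Athlete", "Swimmer"), ("Place", "Settlement", "Town")]

def level_accuracy(gold, pred, level):
    hits = sum(g[level] == p[level] for g, p in zip(gold, pred))
    return hits / len(gold)

for level, name in enumerate(("L1", "L2", "L3")):
    print(name, level_accuracy(gold, pred, level))  # L1 1.0, L2 1.0, L3 0.5
```

As expected for a hierarchy, accuracy tends to drop at the finer levels, which is what makes the L2/L3 targets the tougher benchmark.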
## Content
- DBPedia dataset with multiple levels of hierarchy/classes, as a multiclass dataset.
- Original DBPedia ontology (triplets data): https://wiki.dbpedia.org/develop/datasets
- Listing of the class tree/taxonomy: http://mappings.dbpedia.org/server/ontology/classes/
## Acknowledgements
- Thanks to the Wikimedia foundation for creating Wikipedia, DBPedia and associated open-data goodness!
- Thanks to my colleagues at Sparkbeyond (https://www.sparkbeyond.com) for pointing me towards the taxonomical version of this dataset (as opposed to the classic 14 class version)
## Inspiration
- Try different NLP models.
- See also https://www.kaggle.com/datasets/danofer/dbpedia-classes
- Compare to the SOTA in Text Classification on DBpedia - https://paperswithcode.com/sota/text-classification-on-dbpedia |
chinoll/ACGVoice | ---
license: cc-by-nc-sa-4.0
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c63c8cae | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1338
dataset_size: 186
---
# Dataset Card for "c63c8cae"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mostima_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mostima/モスティマ/莫斯提马 (Arknights)
This is the dataset of mostima/モスティマ/莫斯提马 (Arknights), containing 500 images and their tags.
The core tags of this character are `blue_hair, long_hair, horns, blue_eyes, halo, demon_horns, wings, detached_wings, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 999.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mostima_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 455.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mostima_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1299 | 1012.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mostima_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 821.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mostima_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1299 | 1.56 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mostima_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mostima_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, black_jacket, black_shorts, long_sleeves, looking_at_viewer, open_jacket, solo, white_shirt, closed_mouth, cowboy_shot, smile, white_background, black_gloves, holding_staff, short_shorts, simple_background, fur-trimmed_jacket, asymmetrical_gloves, hood, white_gloves |
| 1 | 44 |  |  |  |  |  | 1girl, solo, upper_body, black_jacket, white_shirt, looking_at_viewer, simple_background, open_jacket, smile, fur-trimmed_jacket, white_background, closed_mouth, hood, long_sleeves, white_gloves |
| 2 | 15 |  |  |  |  |  | 1girl, black_gloves, official_alternate_costume, solo, white_dress, holding_staff, cowboy_shot, elbow_gloves, looking_at_viewer, medium_breasts, capelet, closed_mouth, parted_lips, partially_fingerless_gloves |
| 3 | 7 |  |  |  |  |  | 1girl, black_gloves, holding_staff, official_alternate_costume, solo, white_dress, closed_mouth, looking_at_viewer, elbow_gloves, partially_fingerless_gloves, energy_wings, feet_out_of_frame |
| 4 | 5 |  |  |  |  |  | 1girl, black_gloves, holding_staff, official_alternate_costume, solo, white_dress, black_footwear, full_body, looking_at_viewer, short_sleeves, smile, boots, medium_breasts, closed_mouth, elbow_gloves, energy_wings, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | black_shorts | long_sleeves | looking_at_viewer | open_jacket | solo | white_shirt | closed_mouth | cowboy_shot | smile | white_background | black_gloves | holding_staff | short_shorts | simple_background | fur-trimmed_jacket | asymmetrical_gloves | hood | white_gloves | upper_body | official_alternate_costume | white_dress | elbow_gloves | medium_breasts | capelet | parted_lips | partially_fingerless_gloves | energy_wings | feet_out_of_frame | black_footwear | full_body | short_sleeves | boots | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:---------------|:--------------------|:--------------|:-------|:--------------|:---------------|:--------------|:--------|:-------------------|:---------------|:----------------|:---------------|:--------------------|:---------------------|:----------------------|:-------|:---------------|:-------------|:-----------------------------|:--------------|:---------------|:-----------------|:----------|:--------------|:------------------------------|:---------------|:--------------------|:-----------------|:------------|:----------------|:--------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 44 |  |  |  |  |  | X | X | | X | X | X | X | X | X | | X | X | | | | X | X | | X | X | X | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | | | | X | | X | | X | X | | | X | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | X | | X | | X | | | | X | X | | | | | | | | X | X | X | | | | X | X | X | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | X | | X | | X | | X | X | | | | | | | | X | X | X | X | | | | X | | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-106000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 989574
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lucadiliello/trivia_as2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 419044714
num_examples: 1843349
- name: dev
num_bytes: 26773779
num_examples: 117012
- name: test
num_bytes: 26061784
num_examples: 114853
download_size: 184246492
dataset_size: 471880277
---
# Dataset Card for "trivia_as2"
Answer Sentence Selection version of the TriviaQA dataset. For more info, check out the original [repository](https://github.com/lucadiliello/answer-selection). |
iamnguyen/fqa | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: vector
sequence: float64
- name: tokenized_question
dtype: string
splits:
- name: train
num_bytes: 2380872
num_examples: 178
download_size: 1966701
dataset_size: 2380872
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Dahoas/cot_gsm8k_toy | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 516057.4248302619
num_examples: 483
- name: test
num_bytes: 84960.09552691433
num_examples: 78
- name: val
num_bytes: 17781.6015625
num_examples: 17
download_size: 273840
dataset_size: 618799.1219196762
---
# Dataset Card for "cot_gsm8k_toy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepghs/anime_face_detection | ---
license: mit
task_categories:
- object-detection
tags:
- art
size_categories:
- 1K<n<10K
---
Dataset for anime face detection (face only, not the entire head).
| Dataset | Train | Test | Validate | Description |
|:-----------------------:|:-----:|:----:|:--------:|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| v1.4 | 12798 | 622 | 1217 | Additional images from different categories have been annotated based on the `v1` dataset. Furthermore, all automatically annotated data samples from the `v1` dataset have been manually corrected. |
| v1.4-raw                | 4266  | 622  | 1217     | Same as `v1.4`, without any preprocessing or data augmentation. Suitable for direct upload to the Roboflow platform. |
| v1                      | 5943  | 293  | 566      | Primarily consists of illustrations, auto-annotated with [hysts/anime-face-detector](https://github.com/hysts/anime-face-detector), with the necessary manual corrections performed. |
| raw                     | 1981  | 293  | 566      | Same as `v1`, without any preprocessing or data augmentation. Suitable for direct upload to the Roboflow platform. |
| Anime Face CreateML.v1i | 4263 | 609 | 1210 | Third-party dataset, source: https://universe.roboflow.com/my-workspace-mph8o/anime-face-createml/dataset/1 |
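When evaluating a face detector trained on these splits, predictions are usually matched to ground-truth boxes by intersection-over-union (IoU). A minimal sketch, assuming the common `(x1, y1, x2, y2)` pixel-coordinate box format (the dataset's actual annotation format may differ):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) format."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (empty if the boxes do not overlap)
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    if inter == 0:
        return 0.0
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter)

# A prediction is typically counted as a true positive at IoU >= 0.5
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```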
The best practice is to combine the `Anime Face CreateML.v1i` dataset with the `v1.4` dataset for training. We provide an [online demo](https://huggingface.co/spaces/deepghs/anime_object_detection). |
duongcac/kdllora | ---
license: creativeml-openrail-m
---
|
CyberHarem/okita_souji_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of okita_souji/沖田総司/冲田总司 (Fate/Grand Order)
This is the dataset of okita_souji/沖田総司/冲田总司 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `blonde_hair, short_hair, bow, hair_bow, black_bow, ahoge, yellow_eyes, hair_between_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 840.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 721.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1308 | 1.38 GiB | [Download](https://huggingface.co/datasets/CyberHarem/okita_souji_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/okita_souji_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, hakama_skirt, holding_sword, katana, solo, wide_sleeves, pink_kimono, looking_at_viewer, sheath, cherry_blossoms, closed_mouth, petals |
| 1 | 6 |  |  |  |  |  | 1girl, black_scarf, hakama_skirt, haori, katana, shinsengumi, solo, holding_sword, looking_at_viewer, short_ponytail, white_kimono, wide_sleeves, arm_guards |
| 2 | 27 |  |  |  |  |  | 1girl, black_scarf, haori, holding_sword, katana, short_kimono, solo, black_thighhighs, shinsengumi, looking_at_viewer, white_kimono, obi, sheath, cherry_blossoms, petals, toeless_legwear |
| 3 | 9 |  |  |  |  |  | 1girl, black_scarf, looking_at_viewer, solo, white_kimono, haori, shinsengumi, obi, upper_body, closed_mouth, simple_background, smile, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_thighhighs, feet, looking_at_viewer, obi, short_kimono, sitting, sleeveless_kimono, solo, toes, white_kimono, no_shoes, arm_guards, full_body, stirrup_legwear, black_panties, blush, large_breasts, petals, pink_hair, simple_background, smile, soles, thighs |
| 5 | 9 |  |  |  |  |  | 1girl, black_bikini, black_scarf, katana, looking_at_viewer, single_glove, solo, black_gloves, black_thighhighs, elbow_gloves, highleg_bikini, holding_sword, bare_shoulders, cleavage, layered_bikini, smile, large_breasts, navel, thigh_strap, closed_mouth, blush, thighs |
| 6 | 5 |  |  |  |  |  | 1girl, belt, collared_shirt, katana, long_sleeves, sheath, solo, black_jacket, black_necktie, black_pants, black_suit, closed_mouth, formal, looking_at_viewer, white_background, half_updo, smile, white_shirt, grey_shirt, holding_sword, open_clothes, pant_suit, short_ponytail, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hakama_skirt | holding_sword | katana | solo | wide_sleeves | pink_kimono | looking_at_viewer | sheath | cherry_blossoms | closed_mouth | petals | black_scarf | haori | shinsengumi | short_ponytail | white_kimono | arm_guards | short_kimono | black_thighhighs | obi | toeless_legwear | upper_body | simple_background | smile | white_background | bare_shoulders | feet | sitting | sleeveless_kimono | toes | no_shoes | full_body | stirrup_legwear | black_panties | blush | large_breasts | pink_hair | soles | thighs | black_bikini | single_glove | black_gloves | elbow_gloves | highleg_bikini | cleavage | layered_bikini | navel | thigh_strap | belt | collared_shirt | long_sleeves | black_jacket | black_necktie | black_pants | black_suit | formal | half_updo | white_shirt | grey_shirt | open_clothes | pant_suit |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------------|:---------|:-------|:---------------|:--------------|:--------------------|:---------|:------------------|:---------------|:---------|:--------------|:--------|:--------------|:-----------------|:---------------|:-------------|:---------------|:-------------------|:------|:------------------|:-------------|:--------------------|:--------|:-------------------|:-----------------|:-------|:----------|:--------------------|:-------|:-----------|:------------|:------------------|:----------------|:--------|:----------------|:------------|:--------|:---------|:---------------|:---------------|:---------------|:---------------|:-----------------|:-----------|:-----------------|:--------|:--------------|:-------|:-----------------|:---------------|:---------------|:----------------|:--------------|:-------------|:---------|:------------|:--------------|:-------------|:---------------|:------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 27 |  |  |  |  |  | X | | X | X | X | | | X | X | X | | X | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | | X | | | X | | | X | | X | X | X | | X | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | | X | | | | X | | | | | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | X | X | | | X | | | X | | X | | | | | | | X | | | | | X | | X | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | X | X | | | X | X | | X | | | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
coastalcph/fm-updates-falcon-instruct-7b | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query
struct:
- name: label
dtype: string
- name: objects
list:
- name: aliases
sequence: string
- name: label
dtype: string
- name: qid
dtype: string
- name: qid
dtype: string
- name: rel_id
dtype: string
- name: relation
dtype: string
- name: prediction
struct:
- name: predictions
list:
- name: answer
dtype: string
- name: first_token_probability
dtype: float64
- name: per_token_probability
sequence: float64
- name: perplexity
dtype: float64
- name: query
dtype: string
- name: f1
dtype: float64
- name: relation
dtype: string
- name: type
dtype: string
- name: original_answer
dtype: string
- name: updates
sequence: string
splits:
- name: test
num_bytes: 694312.6861702128
num_examples: 1749
download_size: 383499
dataset_size: 694312.6861702128
---
# Dataset Card for "fm-updates-falcon-instruct-7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iamsubrata/preprocessed_lamini_dataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1846159
num_examples: 1260
- name: test
num_bytes: 205768
num_examples: 140
download_size: 698681
dataset_size: 2051927
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
lansinuote/diffsion_from_scratch | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 119417305.0
num_examples: 833
download_size: 99672356
dataset_size: 119417305.0
---
# Dataset Card for "diffsion_from_scratch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juletxara/tydiqa_xtreme | ---
pretty_name: TyDi QA
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
- ar
- bn
- fi
- id
- ja
- sw
- ko
- ru
- te
- th
license:
- apache-2.0
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: tydi-qa
---
# Dataset Card for "tydiqa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/google-research-datasets/tydiqa](https://github.com/google-research-datasets/tydiqa)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3726.74 MB
- **Size of the generated dataset:** 5812.92 MB
- **Total amount of disk used:** 9539.67 MB
### Dataset Summary
TyDi QA is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs.
The languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language
expresses -- such that we expect models performing well on this set to generalize across a large number of the languages
in the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic
information-seeking task and avoid priming effects, questions are written by people who want to know the answer but do not
yet know it (unlike SQuAD and its descendants), and the data is collected directly in each language without
the use of translation (unlike MLQA and XQuAD).
We also include "translate-train" and "translate-test" splits for each non-English language from XTREME (Hu et al., 2020). These splits are the automatic translations from English to each target language used in the XTREME paper [https://arxiv.org/abs/2003.11080]. The "translate-train" split purposefully ignores the non-English TyDiQA-GoldP training data to simulate the transfer learning scenario where original-language data is not available and system builders must rely on labeled English data plus existing machine translation systems.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### primary_task
- **Size of downloaded dataset files:** 1863.37 MB
- **Size of the generated dataset:** 5757.59 MB
- **Total amount of disk used:** 7620.96 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"annotations": {
"minimal_answers_end_byte": [-1, -1, -1],
"minimal_answers_start_byte": [-1, -1, -1],
"passage_answer_candidate_index": [-1, -1, -1],
"yes_no_answer": ["NONE", "NONE", "NONE"]
},
"document_plaintext": "\"\\nรองศาสตราจารย์[1] หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร (22 กันยายน 2495 -) ผู้ว่าราชการกรุงเทพมหานครคนที่ 15 อดีตรองหัวหน้าพรรคปร...",
"document_title": "หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร",
"document_url": "\"https://th.wikipedia.org/wiki/%E0%B8%AB%E0%B8%A1%E0%B9%88%E0%B8%AD%E0%B8%A1%E0%B8%A3%E0%B8%B2%E0%B8%8A%E0%B8%A7%E0%B8%87%E0%B8%...",
"language": "thai",
"passage_answer_candidates": "{\"plaintext_end_byte\": [494, 1779, 2931, 3904, 4506, 5588, 6383, 7122, 8224, 9375, 10473, 12563, 15134, 17765, 19863, 21902, 229...",
"question_text": "\"หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร เรียนจบจากที่ไหน ?\"..."
}
```
#### secondary_task
- **Size of downloaded dataset files:** 1863.37 MB
- **Size of the generated dataset:** 55.34 MB
- **Total amount of disk used:** 1918.71 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [394],
"text": ["بطولتين"]
},
"context": "\"أقيمت البطولة 21 مرة، شارك في النهائيات 78 دولة، وعدد الفرق التي فازت بالبطولة حتى الآن 8 فرق، ويعد المنتخب البرازيلي الأكثر تت...",
"id": "arabic-2387335860751143628-1",
"question": "\"كم عدد مرات فوز الأوروغواي ببطولة كاس العالم لكرو القدم؟\"...",
"title": "قائمة نهائيات كأس العالم"
}
```
### Data Fields
The data fields are the same among all splits.
#### primary_task
- `passage_answer_candidates`: a dictionary feature containing:
- `plaintext_start_byte`: an `int32` feature.
- `plaintext_end_byte`: an `int32` feature.
- `question_text`: a `string` feature.
- `document_title`: a `string` feature.
- `language`: a `string` feature.
- `annotations`: a dictionary feature containing:
- `passage_answer_candidate_index`: an `int32` feature.
- `minimal_answers_start_byte`: an `int32` feature.
- `minimal_answers_end_byte`: an `int32` feature.
- `yes_no_answer`: a `string` feature.
- `document_plaintext`: a `string` feature.
- `document_url`: a `string` feature.
#### secondary_task
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: an `int32` feature.
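The `secondary_task` fields follow the SQuAD extractive-QA format: each `answer_start` is a character offset into `context`, so the answer span can be recovered (and validated) by slicing. A small illustrative check, using a made-up English record rather than an actual dataset row:

```python
def answer_spans(example):
    """Recover answer substrings from a SQuAD-format example by slicing
    the context at each answer_start offset."""
    spans = []
    for start, text in zip(example["answers"]["answer_start"],
                           example["answers"]["text"]):
        span = example["context"][start:start + len(text)]
        # The annotated offset should point exactly at the answer text
        assert span == text, "offset does not match the annotated answer"
        spans.append(span)
    return spans

example = {
    "context": "Uruguay won the FIFA World Cup twice, in 1930 and 1950.",
    "question": "How many times did Uruguay win the World Cup?",
    "answers": {"answer_start": [31], "text": ["twice"]},
}
print(answer_spans(example))  # ['twice']
```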
### Data Splits
| name | train | validation |
| -------------- | -----: | ---------: |
| primary_task | 166916 | 18670 |
| secondary_task | 49881 | 5077 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{tydiqa,
title = {TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author = {Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year = {2020},
journal = {Transactions of the Association for Computational Linguistics}
}
```
```
@inproceedings{ruder-etal-2021-xtreme,
title = "{XTREME}-{R}: Towards More Challenging and Nuanced Multilingual Evaluation",
author = "Ruder, Sebastian and
Constant, Noah and
Botha, Jan and
Siddhant, Aditya and
Firat, Orhan and
Fu, Jinlan and
Liu, Pengfei and
Hu, Junjie and
Garrette, Dan and
Neubig, Graham and
Johnson, Melvin",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.802",
doi = "10.18653/v1/2021.emnlp-main.802",
pages = "10215--10245",
}
```
|
epinnock/magicoder-evol-instruct-10k-sampled | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
- name: embeddings
sequence: float64
- name: cluster
dtype: int32
- name: generations
sequence: string
splits:
- name: train
num_bytes: 97916601
num_examples: 9863
download_size: 69492196
dataset_size: 97916601
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AniBirage/orca_deduplicated | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5648
num_examples: 1
download_size: 3853
dataset_size: 5648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nithees/bloom3b-ft-llm | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 3040716.0
num_examples: 371
- name: test
num_bytes: 762228.0
num_examples: 93
download_size: 1892995
dataset_size: 3802944.0
---
# Dataset Card for "bloom3b-ft-llm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
afmck/text8-chunked1024 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 90351564
num_examples: 87891
- name: validation
num_bytes: 5019532
num_examples: 4883
- name: test
num_bytes: 5019532
num_examples: 4883
download_size: 55593486
dataset_size: 100390628
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
voiceintelligenceresearch/MOCKS | ---
annotations_creators:
- expert-generated
language:
- en
- de
- es
- fr
- it
license:
- cc-by-4.0
- mpl-2.0
multilinguality:
- multilingual
dataset_info:
- config_name: config
features:
- name: audio_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
---
# MOCKS: Multilingual Open Custom Keyword Spotting Testset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Paper:** [MOCKS 1.0: Multilingual Open Custom Keyword Spotting Testset](https://www.isca-speech.org/archive/pdfs/interspeech_2023/pudo23_interspeech.pdf)
### Dataset Summary
Multilingual Open Custom Keyword Spotting Testset (MOCKS) is a comprehensive audio testset for evaluating and benchmarking
Open-Vocabulary Keyword Spotting (OV-KWS) models. It supports multiple OV-KWS problems:
both text-based and audio-based keyword spotting, as well as offline and online (streaming) modes.
It is based on the LibriSpeech and Mozilla Common Voice datasets and contains
almost 50,000 keywords, with audio data available in English, French, German, Italian, and Spanish.
The testset was generated using automatically generated alignments, which were used to extract parts of the recordings and split them into keywords and test samples.
MOCKS contains both positive and negative examples selected based on phonetic transcriptions that are challenging and should allow for in-depth OV-KWS model evaluation.
Please refer to our [paper](https://www.isca-speech.org/archive/pdfs/interspeech_2023/pudo23_interspeech.pdf) for further details.
### Supported Tasks and Leaderboards
The MOCKS dataset can be used for the Open-Vocabulary Keyword Spotting (OV-KWS) task. It supports two OV-KWS types:
- Query-by-Text, where the keyword is provided as text and needs to be detected in the audio stream.
- Query-by-Example, where the keyword is provided with enrollment audio for detection in the audio stream.
It also allows for:
- offline keyword detection, where test audio is trimmed to contain only keywords of interest.
- online (streaming) keyword detection, where test audio has past and future context besides keywords of interest.
### Languages
MOCKS incorporates 5 languages:
- English - primary and largest test set,
- German,
- Spanish,
- French,
- Italian.
## Dataset Structure
The MOCKS testset is split by language, source dataset, and OV-KWS type:
```
MOCKS
│
└───de
│ └───MCV
│ │ └───test
│ │ │ └───offline
│ │ │ │ │ all.pair.different.tsv
│ │ │ │ │ all.pair.positive.tsv
│ │ │ │ │ all.pair.similar.tsv
│ │ │ │ │ data.tar.gz
│ │ │ │ │ subset.pair.different.tsv
│ │ │ │ │ subset.pair.positive.tsv
│ │ │ │ │ subset.pair.similar.tsv
│ │ │ │
│ │ │ └───online
│ │ │ │ │ all.pair.different.tsv
│ │ │ │ │ ...
│ │ │ │ data.offline.transcription.tsv
│ │ │ │ data.online.transcription.tsv
│
└───en
│ └───LS-clean
│ │ └───test
│ │ │ └───offline
│ │ │ │ │ all.pair.different.tsv
│ │ │ │ │ ...
│ │ │ │ ...
│ │
│ └───LS-other
│ │ └───test
│ │ │ └───offline
│ │ │ │ │ all.pair.different.tsv
│ │ │ │ │ ...
│ │ │ │ ...
│ │
│ └───MCV
│ │ └───test
│ │ │ └───offline
│ │ │ │ │ all.pair.different.tsv
│ │ │ │ │ ...
│ │ │ │ ...
│
└───...
```
Each split is divided into:
- positive examples (`all.pair.positive.tsv`) - test examples with true keywords, 5000-8000 keywords in each subset,
- similar examples (`all.pair.similar.tsv`) - test examples with similar phrases to the keyword selected based on phonetic transcription distance,
- different examples (`all.pair.different.tsv`) - test examples with completely different phrases.
All those files contain columns separated by tab:
- `keyword_path` - path to audio containing keyword phrase.
- `adversary_keyword_path` - path to test audio.
- `adversary_keyword_timestamp_start` - start time in seconds of the phrase of interest (the keyword from `keyword_path`) within the test audio; this field is only available in the **offline** split.
- `adversary_keyword_timestamp_end` - end time in seconds of the phrase of interest (the keyword from `keyword_path`) within the test audio; this field is only available in the **offline** split.
- `label` - whether `adversary_keyword_path` contains the keyword from `keyword_path` (1 - contains the keyword, 0 - does not).
Each split also contains a subset of the whole data with the same field structure to allow faster evaluation (`subset.pair.*.tsv`).
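As a hedged sketch (assuming the column layout above and the 16 kHz test audio mentioned below; the example row's paths and times are made up), an offline pair file can be parsed with the standard library and the keyword-of-interest region converted to sample indices:

```python
import csv
import io

SAMPLE_RATE = 16_000  # all test files are provided at 16 kHz

def read_pairs(tsv_text):
    """Parse an offline `*.pair.*.tsv` file into test-case dicts.

    Timestamps (seconds) are converted to sample indices so the
    keyword-of-interest region can be sliced out of the test audio.
    """
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    cases = []
    for row in reader:
        start = float(row["adversary_keyword_timestamp_start"])
        end = float(row["adversary_keyword_timestamp_end"])
        cases.append({
            "keyword_path": row["keyword_path"],
            "test_path": row["adversary_keyword_path"],
            "start_sample": int(start * SAMPLE_RATE),
            "end_sample": int(end * SAMPLE_RATE),
            "label": row["label"] == "1",
        })
    return cases

# A tiny illustrative row (hypothetical paths and timestamps):
example = (
    "keyword_path\tadversary_keyword_path\t"
    "adversary_keyword_timestamp_start\tadversary_keyword_timestamp_end\tlabel\n"
    "kw/0001.wav\ttest/0042.wav\t0.50\t1.25\t1\n"
)
cases = read_pairs(example)
print(cases[0]["start_sample"], cases[0]["end_sample"])  # 8000 20000
```

Since `csv.DictReader` keys rows by header name, the sketch is insensitive to the actual column order in the released files.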
Also, transcriptions are provided for each audio in:
- `data.offline.transcription.tsv` - transcriptions for the **offline** examples and for `keyword_path` from the **online** scenario,
- `data.online.transcription.tsv` - transcriptions for the adversary (test) examples from the **online** scenario.
Each file contains three tab-separated columns:
- `path_to_keyword`/`path_to_adversary_keyword` - path to the audio file,
- `keyword_transcription`/`adversary_keyword_transcription` - audio transcription,
- `keyword_phonetic_transcription`/`adversary_keyword_phonetic_transcription` - audio phonetic transcription.
## Using the Dataset
The dataset can be used by:
- downloading the archive and constructing all the test cases based on the provided `tsv` files,
- `datasets` package.
In the latter case, the following should work:
```python
load_dataset(path="voiceintelligenceresearch/MOCKS", name="en.LS-clean", split="offline")
```
The allowed values for `name` are:
- `en.LS-{clean,other}`,
- `en.LS-{clean,other}.positive`,
- `en.LS-{clean,other}.similar`,
- `en.LS-{clean,other}.different`,
- `en.LS-{clean,other}.subset`,
- `en.LS-{clean,other}.positive_subset`,
- `en.LS-{clean,other}.similar_subset`,
- `en.LS-{clean,other}.different_subset`,
- `{de,en,es,fr,it}.MCV`,
- `{de,en,es,fr,it}.MCV.positive`,
- `{de,en,es,fr,it}.MCV.similar`,
- `{de,en,es,fr,it}.MCV.different`,
- `{de,en,es,fr,it}.MCV.subset`,
- `{de,en,es,fr,it}.MCV.positive_subset`,
- `{de,en,es,fr,it}.MCV.similar_subset`,
- `{de,en,es,fr,it}.MCV.different_subset`.
The allowed values for `split` are:
- `offline`,
- `online`.
`load_dataset` provides a list of dictionary objects with the following contents:
```python
{
"keyword_id": datasets.Value("string"),
"keyword_transcription": datasets.Value("string"),
"test_id": datasets.Value("string"),
"test_transcription": datasets.Value("string"),
"test_audio": datasets.Audio(sampling_rate=16000),
"label": datasets.Value("bool"),
}
```
Each element of this list represents a single test case for the QbyT KWS:
- `keyword_id` - the name of the keyword audio file in `data.tar.gz` (not used in QbyT KWS),
- `keyword_transcription` - transcription of the keyword,
- `test_id` - the name of the test audio file in `data.tar.gz`,
- `test_transcription` - transcription of the test sample,
- `test_audio` - raw data of the test audio,
- `label` - `True` if the test case is positive (`keyword_transcription` is a substring of the `test_transcription`), `False` otherwise (`similar` and `different` subsets).
Note that each test case can be extended to QbyE KWS by reading the proper `keyword_id` file. Unfortunately, there is no easy way to do that in the loading script.
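Outside the loading script, one way to extend a QbyT test case to QbyE is to read the enrollment audio for `keyword_id` directly from the split's `data.tar.gz`. A minimal sketch (the member path passed as `keyword_id` is an assumption; adjust it to the archive's actual layout):

```python
import tarfile

def load_keyword_audio(archive_path, keyword_id):
    """Return the raw bytes of the keyword (enrollment) audio file
    stored under `keyword_id` inside the split's data.tar.gz."""
    with tarfile.open(archive_path, "r:gz") as tar:
        member = tar.extractfile(keyword_id)
        if member is None:
            raise KeyError(f"{keyword_id} not found in {archive_path}")
        return member.read()
```

Each QbyT example's `keyword_id` then pairs its enrollment audio with the `test_audio` already yielded by `load_dataset`.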
All the test files are provided at 16 kHz, even though the `{de,en,es,fr,it}.MCV` files are stored at their original sampling rate (usually 48 kHz) in the `data.tar.gz` archives.
## Dataset Creation
The MOCKS testset was created from LibriSpeech and Mozilla Common Voice (MCV) datasets that are publicly available. To create it:
- an [MFA](https://mfa-models.readthedocs.io/en/latest/acoustic/index.html) forced aligner with publicly available models was used to extract word-level alignments,
- an internally developed, rule-based grapheme-to-phoneme (G2P) algorithm was used to prepare phonetic transcriptions for each sample.
The data is stored in 16-bit, single-channel WAV format. A 16 kHz sampling rate is used for the LibriSpeech-based testset
and a 48 kHz sampling rate for the MCV-based testset.
The offline testset contains an additional 0.1 seconds at the beginning and end of each extracted audio sample to mitigate the cut-speech effect.
The online testset contains approximately 1 second of additional context at the beginning and end of each extracted audio sample.
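The margins described above can be reproduced when re-extracting samples from the source recordings. A small sketch (margin values taken from the text; clamping the window to the recording bounds is an assumption):

```python
def extraction_window(word_start, word_end, duration, margin):
    """Pad an aligned word interval by `margin` seconds on each side,
    clamped to the [0, duration] range of the source recording."""
    start = max(0.0, word_start - margin)
    end = min(duration, word_end + margin)
    return start, end

# offline samples use a 0.1 s margin, online samples roughly 1 s
print(extraction_window(0.05, 1.0, 10.0, 0.1))  # (0.0, 1.1)
print(extraction_window(0.05, 9.8, 10.0, 1.0))  # (0.0, 10.0)
```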
The MOCKS testset is gender balanced.
## Citation Information
```bibtex
@inproceedings{pudo23_interspeech,
author={Mikołaj Pudo and Mateusz Wosik and Adam Cieślak and Justyna Krzywdziak and Bożena Łukasiak and Artur Janicki},
title={{MOCKS} 1.0: Multilingual Open Custom Keyword Spotting Testset},
year={2023},
booktitle={Proc. Interspeech 2023},
}
``` |
yumin8136/dataset | ---
license: mit
---
|
Ant-j-a/fdr_data | ---
license: gpl-3.0
---
|
Seanxh/twitter_dataset_1713204701 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 134471
num_examples: 315
download_size: 51234
dataset_size: 134471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jkorsvik/cnn_dailymail_nor_v1 | ---
dataset_info:
features:
- name: article
dtype: string
- name: highlights
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 49310937
num_examples: 11490
- name: train
num_bytes: 19132437
num_examples: 23000
- name: validation
num_bytes: 56993804
num_examples: 13368
download_size: 75797719
dataset_size: 125437178
---
# Dataset Card for "cnn_dailymail_nor_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CarperAI/pile-v2-small-filtered | ---
annotations_creators: []
language_creators:
- crowdsourced
language: ["en","code"]
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids:
- language-modeling
---
## Dataset Description
A small subset of the [pile-v2]() dataset: each of its subsets contains ~1,000 random samples from the original dataset. In total, the dataset has 255 MB of text (code and English).
## Languages
The dataset contains technical text on programming languages as well as natural language, organized into the following subsets:
- Bible
- TED2020
- PileOfLaw
- StackExchange
- GithubIssues
- Opensubtitles
- USPTO
- S2ORC
- DevDocs
- CodePileReddit2022
- USENET
- GNOME
- ASFPublicMail
- PileV2Reddit2020
- CodePilePosts
- Discourse
- Tanzil
- arXiv
- UbuntuIRC
- PubMed
- CodePileReddit2020
- CodePileReddit2021
- GlobalVoices
- FreeLaw_Options
- PileV2Posts
## Dataset Structure
```python
from datasets import load_dataset
load_dataset("CarperAI/pile-v2-small")
```
### How to use it
You can either load the whole dataset as above, or load a specific subset such as arXiv by passing its folder as `data_dir`:
```python
load_dataset("CarperAI/pile-v2-small", data_dir="data/arxiv")
```
|
spoartens300/dataset | ---
license: mit
---
|
Yusuf5/OpenCaselistLI | ---
dataset_info:
features:
- name: rowNum
dtype: int64
- name: id
dtype: int64
- name: fileId
dtype: int64
- name: pocket
dtype: string
- name: hat
dtype: string
- name: block
dtype: string
- name: text
dtype: string
- name: fullcite
dtype: string
- name: cite
dtype: string
- name: bucketId
dtype: int64
- name: duplicateCount
dtype: int64
- name: textLength
dtype: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 614838996.9803674
num_examples: 1047870
- name: validate
num_bytes: 77016964.64603722
num_examples: 131260
- name: test
num_bytes: 76885532.3735954
num_examples: 131036
download_size: 236929052
dataset_size: 768741494.0
---
# Dataset Card for "OpenCaselistLI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
buddhist-nlp/pali-english | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: file_name
dtype: string
splits:
- name: train
num_bytes: 34632454.0
num_examples: 132151
- name: validation
num_bytes: 2063756.0
num_examples: 7832
- name: test
num_bytes: 2049351.0
num_examples: 7832
- name: test_500
num_bytes: 124892.0
num_examples: 499
- name: validation_500
num_bytes: 132892.0
num_examples: 499
download_size: 21840989
dataset_size: 39003345.0
---
# Dataset Card for "pali-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andersonbcdefg/inpars-triples-filtered | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: source
dtype: string
- name: __index_level_0__
dtype: int64
- name: pos_score
dtype: float64
- name: neg_score
dtype: float64
- name: margin
dtype: float64
splits:
- name: train
num_bytes: 350597613.6393662
num_examples: 130789
download_size: 210051723
dataset_size: 350597613.6393662
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B | ---
pretty_name: Evaluation run of leveldevai/MarcDareBeagle-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [leveldevai/MarcDareBeagle-7B](https://huggingface.co/leveldevai/MarcDareBeagle-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T08:53:36.117087](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B/blob/main/results_2024-01-19T08-53-36.117087.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6561257128939425,\n\
\ \"acc_stderr\": 0.03198667178637761,\n \"acc_norm\": 0.6554096772583735,\n\
\ \"acc_norm_stderr\": 0.03265522262038939,\n \"mc1\": 0.5410036719706243,\n\
\ \"mc1_stderr\": 0.017444544447661206,\n \"mc2\": 0.6808641879386289,\n\
\ \"mc2_stderr\": 0.015124785314472101\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847632,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.01310678488360133\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7101175064728141,\n\
\ \"acc_stderr\": 0.004527804016253783,\n \"acc_norm\": 0.8832901812387971,\n\
\ \"acc_norm_stderr\": 0.00320418007294237\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.02407869658063548,\n \
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063548\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045702,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045702\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5410036719706243,\n\
\ \"mc1_stderr\": 0.017444544447661206,\n \"mc2\": 0.6808641879386289,\n\
\ \"mc2_stderr\": 0.015124785314472101\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.01051033695416674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7179681576952237,\n \
\ \"acc_stderr\": 0.012394926584335695\n }\n}\n```"
repo_url: https://huggingface.co/leveldevai/MarcDareBeagle-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|arc:challenge|25_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|gsm8k|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hellaswag|10_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T08-53-36.117087.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- '**/details_harness|winogrande|5_2024-01-19T08-53-36.117087.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T08-53-36.117087.parquet'
- config_name: results
data_files:
- split: 2024_01_19T08_53_36.117087
path:
- results_2024-01-19T08-53-36.117087.parquet
- split: latest
path:
- results_2024-01-19T08-53-36.117087.parquet
---
# Dataset Card for Evaluation run of leveldevai/MarcDareBeagle-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [leveldevai/MarcDareBeagle-7B](https://huggingface.co/leveldevai/MarcDareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
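As a small illustration of the split-naming scheme, the sketch below derives a split name from a run timestamp. The convention (replacing `-` and `:` with `_`) is inferred from the split names listed in this card's YAML, not from any official documentation, so treat it as an assumption:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Derive a per-run split name from a run timestamp.

    Assumption: split names are the timestamp with "-" and ":"
    replaced by "_", as seen in this card's YAML config listing.
    """
    return ts.replace("-", "_").replace(":", "_")


print(split_name_from_timestamp("2024-01-19T08:53:36.117087"))
# 2024_01_19T08_53_36.117087
```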
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-19T08:53:36.117087](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__MarcDareBeagle-7B/blob/main/results_2024-01-19T08-53-36.117087.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6561257128939425,
"acc_stderr": 0.03198667178637761,
"acc_norm": 0.6554096772583735,
"acc_norm_stderr": 0.03265522262038939,
"mc1": 0.5410036719706243,
"mc1_stderr": 0.017444544447661206,
"mc2": 0.6808641879386289,
"mc2_stderr": 0.015124785314472101
},
"harness|arc:challenge|25": {
"acc": 0.6988054607508533,
"acc_stderr": 0.013406741767847632,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.01310678488360133
},
"harness|hellaswag|10": {
"acc": 0.7101175064728141,
"acc_stderr": 0.004527804016253783,
"acc_norm": 0.8832901812387971,
"acc_norm_stderr": 0.00320418007294237
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063548,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063548
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045702,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045702
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5410036719706243,
"mc1_stderr": 0.017444544447661206,
"mc2": 0.6808641879386289,
"mc2_stderr": 0.015124785314472101
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.01051033695416674
},
"harness|gsm8k|5": {
"acc": 0.7179681576952237,
"acc_stderr": 0.012394926584335695
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_159 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1233577736.0
num_examples: 242258
download_size: 1258252268
dataset_size: 1233577736.0
---
# Dataset Card for "chunk_159"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-11000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1054067
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ammarnasr/Python-React-Code-Dataset | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: text
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphnanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 10485444.152492668
num_examples: 1432
- name: test
num_bytes: 1354613.944281525
num_examples: 185
- name: valid
num_bytes: 3141239.9032258065
num_examples: 429
download_size: 4774703
dataset_size: 14981298.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
ammaralam/medical_Ar | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36864
num_examples: 98
download_size: 9832
dataset_size: 36864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
spsither/prepare_dataset_train_batch0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 95821956240
num_examples: 99760
download_size: 156274877
dataset_size: 95821956240
---
# Dataset Card for "prepare_dataset_train_batch0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713146984 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2287766
num_examples: 7075
download_size: 1280024
dataset_size: 2287766
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
3ee/regularization-forest | ---
license: mit
tags:
- stable-diffusion
- regularization-images
- text-to-image
- image-to-image
- dreambooth
- class-instance
- preservation-loss-training
- forest
---
# Forest Regularization Images
A collection of regularization & class-instance image datasets of forests for the Stable Diffusion 1.5 model, intended for DreamBooth prior-preservation-loss training. |
soketlabs/bhasha-wiki-translated | ---
license: cc-by-sa-4.0
dataset_info:
- config_name: wiki_translated
splits:
- name: train
num_bytes: 20200062385
num_examples: 6407814
download_size: 11630929031
dataset_size: 20200062385
configs:
- config_name: wiki_translated
data_files:
- wiki_translated/*.parquet
language:
- hi
- gu
- ur
- bn
- kn
- ta
- en
size_categories:
- 1M<n<10M
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
tags:
- indic
---
# Bhasha Wikipedia Translated
<!-- Provide a quick summary of the dataset. -->
Translated Wikipedia articles
## Dataset Details
The dataset is being updated.
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
We have translated 6.185 million English Wikipedia articles into 6 Indic languages. The translations were done using the IndicTrans2 model.
- **Curated by:** [Soket AI labs](https://soket.ai/)
- **Language(s) (NLP):** Hindi, Bengali, Gujarati, Tamil, Kannada, Urdu
- **License:** cc-by-sa-4.0
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
For pretraining or fine-tuning Indic language models.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
Wikipedia articles
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
billalchaouche/8dretnaEN | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: start_time
dtype: string
- name: end_time
dtype: string
splits:
- name: train
num_bytes: 45658152.408
num_examples: 1503
- name: validation
num_bytes: 2550103.0
num_examples: 111
download_size: 51144073
dataset_size: 48208255.408
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
AlekseyKorshuk/test-conversation-with-system | ---
dataset_info:
features:
- name: system
dtype: string
- name: conversation
list:
- name: from
dtype: string
- name: role_type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 67027776
num_examples: 10000
download_size: 24743939
dataset_size: 67027776
---
# Dataset Card for "test-conversation-with-system"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coastalcph/fm_classifier_mutable-1-n | ---
dataset_info:
features:
- name: query
dtype: string
- name: answer
list:
- name: wikidata_id
dtype: string
- name: name
dtype: string
- name: id
dtype: string
- name: relation
dtype: string
- name: date
dtype: int64
- name: type
dtype: string
- name: is_mutable
dtype: int64
splits:
- name: train
num_bytes: 1608732.147303521
num_examples: 8977
- name: all_fm
num_bytes: 30017653.417646818
num_examples: 157125
- name: validation
num_bytes: 1016408.1453548166
num_examples: 5916
- name: test
num_bytes: 1125889.2970730583
num_examples: 5724
download_size: 7539663
dataset_size: 33768683.00737821
---
# Dataset Card for "fm_classifier_mutable-1-n"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/SpeechTranslation_CoVoST2-zh-CN_en | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: file
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 265052595.194
num_examples: 4898
download_size: 236600585
dataset_size: 265052595.194
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
airesearch/UD_Thai-PUD | ---
dataset_info:
features:
- name: words
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': PUNCT
'1': ADP
'2': VERB
'3': PART
'4': NOUN
'5': ADJ
'6': AUX
'7': DET
'8': ADV
'9': PROPN
'10': CCONJ
'11': PRON
'12': NUM
'13': SYM
'14': SCONJ
'15': X
splits:
- name: test
num_bytes: 561336
num_examples: 1000
download_size: 92891
dataset_size: 561336
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
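The `pos_tags` feature stores class-label ids rather than strings. Below is a minimal sketch of decoding them back to UPOS tag names, with the id-to-name mapping copied from the front matter above (`decode_tags` is an illustrative helper, not part of the dataset):

```python
# UPOS id -> name mapping, copied from this card's class_label names.
UPOS_NAMES = [
    "PUNCT", "ADP", "VERB", "PART", "NOUN", "ADJ", "AUX", "DET",
    "ADV", "PROPN", "CCONJ", "PRON", "NUM", "SYM", "SCONJ", "X",
]

def decode_tags(tag_ids):
    """Map a sequence of integer pos_tags back to their UPOS tag names."""
    return [UPOS_NAMES[i] for i in tag_ids]
```

For example, `decode_tags([9, 2, 0])` yields `["PROPN", "VERB", "PUNCT"]`.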
|
braindao/Enhanced-Slither-Audited-Solidity-QA | ---
dataset_info:
features:
- name: results
dtype: string
- name: source_code
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 275448756
num_examples: 9477
download_size: 81424292
dataset_size: 275448756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Enhanced-Slither-Audited-Solidity-QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-61000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 652598
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jkorsvik/cnn_daily_mail_nor_final | ---
dataset_info:
features:
- name: article
dtype: string
- name: highlights
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 192449381.25865006
num_examples: 47171
- name: validation
num_bytes: 14487455.276535718
num_examples: 3551
- name: test
num_bytes: 22993888.464814223
num_examples: 5636
download_size: 146363858
dataset_size: 229930725.0
---
# Dataset Card for "cnn_daily_mail_nor_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NbAiLab/salmon-asr-smj | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
- name: duration
dtype: float64
splits:
- name: train
num_bytes: 3425289656.938
num_examples: 18657
- name: validation
num_bytes: 20146487.0
num_examples: 100
- name: test
num_bytes: 19303449.0
num_examples: 100
download_size: 3896709446
dataset_size: 3464739592.938
---
# Dataset Card for "salmon-asr-smj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Xilabs/PIPPA-alpaca | ---
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- text-generation
configs:
- config_name: default
data_files:
- split: smol_pippa_named_users
path: data/smol_pippa_named_users-*
- split: smol_pippa
path: data/smol_pippa-*
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: smol_pippa_named_users
num_bytes: 77441911
num_examples: 38199
- name: smol_pippa
num_bytes: 68511557
num_examples: 38232
download_size: 64841938
dataset_size: 145953468
tags:
- not-for-all-audiences
- alpaca
- conversational
- roleplay
---
# Dataset Card for "Pippa-alpaca"
This dataset is derived from the PIPPA dataset and uses the Alpaca format.
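The three fields (`instruction`, `input`, `output`) follow the Alpaca schema. A minimal sketch of rendering one record into a training prompt — the template wording below is the common Alpaca convention and an assumption here, not something this dataset prescribes:

```python
def to_alpaca_prompt(example: dict) -> str:
    """Render one record's instruction/input/output into an Alpaca-style prompt.

    The field names mirror this dataset's schema; the surrounding template
    text is the usual Alpaca convention, assumed for illustration.
    """
    if example.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )
```

Records with an empty `input` field fall through to the shorter, input-free template.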
[PIPPA - Personal Interaction Pairs between People and AI](https://huggingface.co/datasets/PygmalionAI/PIPPA) |
kyujinpy/KoCoT_2000 | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
- text-classification
language:
- en
size_categories:
- 1K<n<10K
---
# KoCoT-Collection
A Korean translation of the [kaist-CoT](https://huggingface.co/datasets/kaist-ai/CoT-Collection) dataset, produced with DeepL.
---
# Original Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** https://github.com/kaistAI/CoT-Collection
- **Repository:** https://github.com/kaistAI/CoT-Collection
- **Paper:** https://arxiv.org/abs/2305.14045
- **Point of Contact:** sejune@lklab.io
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
| name | train |
|-------------------|------:|
|CoT-Collection|1837928|
## Additional Information
### Citation Information
```
@article{kim2023cot,
title={The CoT Collection: Improving Zero-shot and Few-shot Learning of Language Models via Chain-of-Thought Fine-Tuning},
author={Kim, Seungone and Joo, Se June and Kim, Doyoung and Jang, Joel and Ye, Seonghyeon and Shin, Jamin and Seo, Minjoon},
journal={arXiv preprint arXiv:2305.14045},
year={2023}
}
``` |
hemantk089/llama2_7b_fine_tuning_all_tasks_w_new_data_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: task
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: query
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 3427150
num_examples: 2952
- name: test
num_bytes: 847687
num_examples: 740
download_size: 1335408
dataset_size: 4274837
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-project-dane-2d14d683-10645434 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- dane
eval_info:
task: entity_extraction
model: saattrupdan/nbailab-base-ner-scandi
metrics: []
dataset_name: dane
dataset_config: default
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: saattrupdan/nbailab-base-ner-scandi
* Dataset: dane
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@KennethEnevoldsen](https://huggingface.co/KennethEnevoldsen) for evaluating this model. |
open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.0-7b-SFT | ---
pretty_name: Evaluation run of Locutusque/OpenCerebrum-1.0-7b-SFT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/OpenCerebrum-1.0-7b-SFT](https://huggingface.co/Locutusque/OpenCerebrum-1.0-7b-SFT)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.0-7b-SFT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T20:47:49.093705](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.0-7b-SFT/blob/main/results_2024-03-27T20-47-49.093705.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6250740164302089,\n\
\ \"acc_stderr\": 0.0326447970500135,\n \"acc_norm\": 0.6301788256025125,\n\
\ \"acc_norm_stderr\": 0.033308458068719876,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.41447253861290334,\n\
\ \"mc2_stderr\": 0.014262997972832498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.01447113339264247,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946712\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6271659032065325,\n\
\ \"acc_stderr\": 0.00482570253392041,\n \"acc_norm\": 0.8325034853614818,\n\
\ \"acc_norm_stderr\": 0.0037265541293484703\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322605,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315525,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315525\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.02446861524147892,\n \
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147892\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150023,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150023\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295838,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295838\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913915,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358978,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358978\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.02881472242225418,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.02881472242225418\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675602,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675602\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616915,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.41447253861290334,\n\
\ \"mc2_stderr\": 0.014262997972832498\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987736\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39423805913570886,\n \
\ \"acc_stderr\": 0.01346085235709565\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/OpenCerebrum-1.0-7b-SFT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|arc:challenge|25_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|gsm8k|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hellaswag|10_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T20-47-49.093705.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T20-47-49.093705.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- '**/details_harness|winogrande|5_2024-03-27T20-47-49.093705.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T20-47-49.093705.parquet'
- config_name: results
data_files:
- split: 2024_03_27T20_47_49.093705
path:
- results_2024-03-27T20-47-49.093705.parquet
- split: latest
path:
- results_2024-03-27T20-47-49.093705.parquet
---
# Dataset Card for Evaluation run of Locutusque/OpenCerebrum-1.0-7b-SFT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/OpenCerebrum-1.0-7b-SFT](https://huggingface.co/Locutusque/OpenCerebrum-1.0-7b-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.0-7b-SFT",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-27T20:47:49.093705](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.0-7b-SFT/blob/main/results_2024-03-27T20-47-49.093705.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6250740164302089,
"acc_stderr": 0.0326447970500135,
"acc_norm": 0.6301788256025125,
"acc_norm_stderr": 0.033308458068719876,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.41447253861290334,
"mc2_stderr": 0.014262997972832498
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.01447113339264247,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946712
},
"harness|hellaswag|10": {
"acc": 0.6271659032065325,
"acc_stderr": 0.00482570253392041,
"acc_norm": 0.8325034853614818,
"acc_norm_stderr": 0.0037265541293484703
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322605,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315525,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150023,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150023
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295838,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295838
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913915,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913915
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.01594930879023364,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.01594930879023364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358978,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358978
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.02881472242225418,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.02881472242225418
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675602,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675602
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616915,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.41447253861290334,
"mc2_stderr": 0.014262997972832498
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987736
},
"harness|gsm8k|5": {
"acc": 0.39423805913570886,
"acc_stderr": 0.01346085235709565
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tomaarsen/conll2002 | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- es
- nl
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
- part-of-speech
paperswithcode_id: conll-2002
pretty_name: CoNLL-2002
config_names:
- es
- nl
dataset_info:
- config_name: es
features:
- name: id
dtype: string
- name: document_id
dtype: int32
- name: sentence_id
dtype: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': AO
'1': AQ
'2': CC
'3': CS
'4': DA
'5': DE
'6': DD
'7': DI
'8': DN
'9': DP
'10': DT
'11': Faa
'12': Fat
'13': Fc
'14': Fd
'15': Fe
'16': Fg
'17': Fh
'18': Fia
'19': Fit
'20': Fp
'21': Fpa
'22': Fpt
'23': Fs
'24': Ft
'25': Fx
'26': Fz
'27': I
'28': NC
'29': NP
'30': P0
'31': PD
'32': PI
'33': PN
'34': PP
'35': PR
'36': PT
'37': PX
'38': RG
'39': RN
'40': SP
'41': VAI
'42': VAM
'43': VAN
'44': VAP
'45': VAS
'46': VMG
'47': VMI
'48': VMM
'49': VMN
'50': VMP
'51': VMS
'52': VSG
'53': VSI
'54': VSM
'55': VSN
'56': VSP
'57': VSS
'58': Y
'59': Z
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 6738717
num_examples: 8323
- name: validation
num_bytes: 1349064
num_examples: 1915
- name: test
num_bytes: 1306252
num_examples: 1517
download_size: 4140690
dataset_size: 9394033
- config_name: nl
features:
- name: id
dtype: string
- name: document_id
dtype: int32
- name: sentence_id
dtype: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': Adj
'1': Adv
'2': Art
'3': Conj
'4': Int
'5': Misc
'6': N
'7': Num
'8': Prep
'9': Pron
'10': Punc
'11': V
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 5435346
num_examples: 15806
- name: validation
num_bytes: 1017418
num_examples: 2895
- name: test
num_bytes: 1850382
num_examples: 5195
download_size: 3642241
dataset_size: 8303146
---
# Dataset Card for CoNLL-2002
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [homepage](https://www.clips.uantwerpen.be/conll2002/ner/)
- **Repository:** [github](https://github.com/teropa/nlp/tree/master/resources/corpora/conll2002)
- **Paper:** [paper](https://www.aclweb.org/anthology/W02-2024/)
- **Point of Contact:** [Erik Tjong Kim Sang](erikt@uia.ua.ac.be)
### Dataset Summary
Named entities are phrases that contain the names of persons, organizations, locations, times and quantities. Example:
[PER Wolff] , currently a journalist in [LOC Argentina] , played with [PER Del Bosque] in the final years of the seventies in [ORG Real Madrid] .
The shared task of CoNLL-2002 concerns language-independent named entity recognition. We will concentrate on four types of named entities: persons, locations, organizations and names of miscellaneous entities that do not belong to the previous three groups. The participants of the shared task will be offered training and test data for at least two languages. They will use the data for developing a named-entity recognition system that includes a machine learning component. Information sources other than the training data may be used in this shared task. We are especially interested in methods that can use additional unannotated data for improving their performance (for example co-training).
### Supported Tasks and Leaderboards
Named Entity Recognition (NER) is a subtask of Information Extraction. Different NER systems were evaluated as a part of the Sixth Message Understanding Conference in 1995 (MUC6). The target language was English. The participating systems performed well. However, many of them used language-specific resources for performing the task and it is unknown how they would have performed on another language than English.
After 1995 NER systems have been developed for some European languages and a few Asian languages. There have been at least two studies that have applied one NER system to different languages. Palmer and Day [PD97] have used statistical methods for finding named entities in newswire articles in Chinese, English, French, Japanese, Portuguese and Spanish. They found that the difficulty of the NER task was different for the six languages but that a large part of the task could be performed with simple methods. Cucerzan and Yarowsky [CY99] used both morphological and contextual clues for identifying named entities in English, Greek, Hindi, Rumanian and Turkish. With minimal supervision, they obtained overall F measures between 40 and 70, depending on the languages used.
- `named-entity-recognition`: The performance in this task is measured with [F1](https://huggingface.co/metrics/f1) (higher is better). A named entity is correct only if it is an exact match of the corresponding entity in the data.
- `parsing`: The performance in this task is measured with [F1](https://huggingface.co/metrics/f1) (higher is better). A part-of-speech tag is correct only if it is equal to the corresponding tag in the data.
### Languages
There are two languages available : Spanish (es) and Dutch (nl).
## Dataset Structure
### Data Instances
The examples look like this :
```
{
'id': '0',
'document_id': 0,
'sentence_id': 0,
'tokens': ['Melbourne', '(', 'Australia', ')', ',', '25', 'may', '(', 'EFE', ')', '.'],
'pos_tags': [29, 21, 29, 22, 13, 59, 28, 21, 28, 22, 20],
'ner_tags': [5, 0, 5, 0, 0, 0, 0, 0, 3, 0, 0]
}
```
The original data files within the Dutch sub-dataset contain `-DOCSTART-` lines, which act as boundaries between two different documents; these lines are filtered out in this implementation.
### Data Fields
- `id`: id of the sample
- `document_id`: an `int32` feature tracking which document the sample is from.
- `sentence_id`: an `int32` feature tracking which sentence in this document the sample is from.
- `tokens`: the tokens of the example text
- `ner_tags`: the NER tags of each token
- `pos_tags`: the POS tags of each token
The POS tags correspond to this list for Spanish:
```
'AO', 'AQ', 'CC', 'CS', 'DA', 'DE', 'DD', 'DI', 'DN', 'DP', 'DT', 'Faa', 'Fat', 'Fc', 'Fd', 'Fe', 'Fg', 'Fh', 'Fia', 'Fit', 'Fp', 'Fpa', 'Fpt', 'Fs', 'Ft', 'Fx', 'Fz', 'I', 'NC', 'NP', 'P0', 'PD', 'PI', 'PN', 'PP', 'PR', 'PT', 'PX', 'RG', 'RN', 'SP', 'VAI', 'VAM', 'VAN', 'VAP', 'VAS', 'VMG', 'VMI', 'VMM', 'VMN', 'VMP', 'VMS', 'VSG', 'VSI', 'VSM', 'VSN', 'VSP', 'VSS', 'Y', 'Z'
```
And this list for Dutch:
```
'Adj', 'Adv', 'Art', 'Conj', 'Int', 'Misc', 'N', 'Num', 'Prep', 'Pron', 'Punc', 'V'
```
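The integer `pos_tags` values index into these lists. A minimal sketch of decoding the example instance above back into tag strings (this assumes the Spanish list order shown here matches the `ClassLabel` order declared in the YAML header, which is how the dataset encodes them):

```python
# Spanish POS tag names in ClassLabel order (copied from the list above).
ES_POS_TAGS = ['AO', 'AQ', 'CC', 'CS', 'DA', 'DE', 'DD', 'DI', 'DN', 'DP', 'DT',
               'Faa', 'Fat', 'Fc', 'Fd', 'Fe', 'Fg', 'Fh', 'Fia', 'Fit', 'Fp',
               'Fpa', 'Fpt', 'Fs', 'Ft', 'Fx', 'Fz', 'I', 'NC', 'NP', 'P0',
               'PD', 'PI', 'PN', 'PP', 'PR', 'PT', 'PX', 'RG', 'RN', 'SP',
               'VAI', 'VAM', 'VAN', 'VAP', 'VAS', 'VMG', 'VMI', 'VMM', 'VMN',
               'VMP', 'VMS', 'VSG', 'VSI', 'VSM', 'VSN', 'VSP', 'VSS', 'Y', 'Z']

# Tokens and integer pos_tags from the example instance above.
tokens = ['Melbourne', '(', 'Australia', ')', ',', '25', 'may', '(', 'EFE', ')', '.']
pos_ids = [29, 21, 29, 22, 13, 59, 28, 21, 28, 22, 20]

# Plain index lookup turns the ids back into tag strings.
decoded = [ES_POS_TAGS[i] for i in pos_ids]
print(list(zip(tokens, decoded)))
# → [('Melbourne', 'NP'), ('(', 'Fpa'), ('Australia', 'NP'), ...]
```

When loading with the `datasets` library, the same mapping is available via the feature's `int2str` method, so hard-coding the list is only needed offline.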
The NER tags correspond to this list:
```
"O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC",
```
The NER tags have the same format as in the chunking task: a B denotes the first item of a phrase and an I any non-initial word. There are four types of phrases: person names (PER), organizations (ORG), locations (LOC) and miscellaneous names (MISC).
It is assumed that named entities are non-recursive and non-overlapping. In case a named entity is embedded in another named entity, usually only the top-level entity is marked.
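The BIO scheme described above can be decoded into entity spans with a short helper. This is a sketch, not part of the dataset distribution; it uses the NER label order from the list above and the `ner_tags` of the example instance shown earlier:

```python
# NER label names in ClassLabel order (copied from the list above).
NER_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
              "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

def bio_to_spans(ner_tag_ids):
    """Decode integer BIO tags into (entity_type, start, end_exclusive) spans."""
    spans = []
    start, current_type = None, None
    for i, tag_id in enumerate(ner_tag_ids):
        label = NER_LABELS[tag_id]
        if label == "O":
            # Close any open span.
            if current_type is not None:
                spans.append((current_type, start, i))
            start, current_type = None, None
        elif label.startswith("B-") or current_type != label[2:]:
            # B- always starts a new span; a type-switching I- does too.
            if current_type is not None:
                spans.append((current_type, start, i))
            start, current_type = i, label[2:]
        # An I- tag matching the open span's type just extends it.
    if current_type is not None:
        spans.append((current_type, start, len(ner_tag_ids)))
    return spans

# ner_tags from the example instance:
# ['Melbourne', '(', 'Australia', ')', ',', '25', 'may', '(', 'EFE', ')', '.']
print(bio_to_spans([5, 0, 5, 0, 0, 0, 0, 0, 3, 0, 0]))
# → [('LOC', 0, 1), ('LOC', 2, 3), ('ORG', 8, 9)]
```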
### Data Splits
For both configurations (Spanish and Dutch), there are three splits.
The original splits were named `train`, `testa` and `testb` and they correspond to the `train`, `validation` and `test` splits.
The splits have the following sizes :
| | train | validation | test |
| ----- |-------:|------------:|------:|
| N. Examples (Spanish) | 8324 | 1916 | 1518 |
| N. Examples (Dutch) | 15807 | 2896 | 5196 |
## Dataset Creation
### Curation Rationale
The dataset was created to provide new resources for two languages that were under-served for statistical machine learning at the time, Dutch and Spanish.
[More Information Needed]
### Source Data
The Spanish data is a collection of news wire articles made available by the Spanish EFE News Agency. The articles are from May 2000.
The Dutch data consist of four editions of the Belgian newspaper "De Morgen" of 2000 (June 2, July 1, August 1 and September 1).
#### Initial Data Collection and Normalization
The articles were word-tokenized; information on the exact pre-processing pipeline is unavailable.
#### Who are the source language producers?
The source language was produced by journalists and writers employed by the news agency and newspaper mentioned above.
### Annotations
#### Annotation process
For the Dutch data, the annotator has followed the MITRE and SAIC guidelines for named entity recognition (Chinchor et al., 1999) as well as possible.
#### Who are the annotators?
The Spanish data annotation was carried out by the TALP Research Center of the Technical University of Catalonia (UPC) and the Center of Language and Computation (CLiC) of the University of Barcelona (UB).
The Dutch data was annotated as a part of the Atranos project at the University of Antwerp.
### Personal and Sensitive Information
The data is sourced from newspaper text and only contains mentions of public figures or individuals in the news.
## Considerations for Using the Data
### Social Impact of Dataset
Named Entity Recognition systems can be used to efficiently index news text, making it easy to gather all information pertaining to an organization or individual. Making such resources widely available in languages other than English can support better research and user experience for a larger part of the world's population. At the same time, better indexing and discoverability can also enable surveillance by state actors.
### Discussion of Biases
News text reproduces the biases of society, and any system trained on news data should be cognizant of these limitations and the risk for models to learn spurious correlations in this context, for example between a person's gender and their occupation.
### Other Known Limitations
Users should keep in mind that the dataset only contains news text, which might limit the applicability of the developed systems to other domains.
## Additional Information
### Dataset Curators
The annotation of the Spanish data was funded by the European Commission through the NAMIC project (IST-1999-12392).
### Licensing Information
The licensing status of the data, especially the news source text, is unknown.
### Citation Information
```
@inproceedings{tjong-kim-sang-2002-introduction,
title = "Introduction to the {C}o{NLL}-2002 Shared Task: Language-Independent Named Entity Recognition",
author = "Tjong Kim Sang, Erik F.",
booktitle = "{COLING}-02: The 6th Conference on Natural Language Learning 2002 ({C}o{NLL}-2002)",
year = "2002",
url = "https://www.aclweb.org/anthology/W02-2024",
}
```
### Contributions
Thanks to [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
heliosprime/twitter_dataset_1713106441 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13450
num_examples: 39
download_size: 13705
dataset_size: 13450
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713106441"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/145_Hours_Spanish_Child_Spontaneous_Speech_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Spanish (Spain) children's real-world casual conversation and monologue speech dataset, covering self-media, conversation, live streaming, lecture, variety show and other generic domains, mirroring real-world interactions. Transcribed with text content, speaker ID, gender, age, accent and other attributes. Our dataset was collected from an extensive and diverse pool of speakers (children 12 years old and younger) across geographies, enhancing model performance in real and complex tasks. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring the maintenance of user privacy and legal rights throughout the data collection, storage, and usage processes; our datasets are all GDPR, CCPA and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1251?source=Huggingface
## Format
16kHz, 16 bit, wav, mono channel
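
As a rough illustration of that format, the sketch below writes and inspects a 16 kHz, 16-bit, mono WAV file with Python's standard `wave` module; the audio here is one second of silence, not actual dataset content:

```python
import io
import wave

# Write one second of silence in the dataset's stated format:
# 16 kHz sample rate, 16-bit samples, mono channel.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)      # mono
    w.setsampwidth(2)      # 16 bit = 2 bytes per sample
    w.setframerate(16000)  # 16 kHz
    w.writeframes(b"\x00\x00" * 16000)

# Read it back and confirm the parameters a consumer would check.
buf.seek(0)
with wave.open(buf, "rb") as r:
    params = r.getparams()
    print(params.nchannels, params.sampwidth, params.framerate)  # 1 2 16000
```

The same `getparams()` check can be used to validate downloaded files before feeding them to an ASR pipeline.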
## Age
12 years old and younger children
## Content category
including interviews, self-media, variety shows, etc.
## Recording environment
Low background noise
## Country
Spain(ES)
## Language(Region) Code
es-ES
## Language
Spanish
## Features of annotation
Transcription text, timestamp, speaker ID, gender, noise
## Accuracy
Word Accuracy Rate (WAR) 95%
# Licensing Information
Commercial License
|
bakhitovd/ML_arxiv | ---
license: cc0-1.0
task_categories:
- summarization
language:
- en
pretty_name: ML Articles Subset of Scientific Papers
size_categories:
- 10K<n<100K
---
# Dataset Card for 'ML Articles Subset of Scientific Papers' Dataset
## Dataset Summary
The dataset consists of 32,621 instances from the 'Scientific papers' dataset, a selection of scientific papers and summaries from the ArXiv repository. This subset focuses on articles that are semantically, vocabulary-wise, structurally, and meaningfully closest to articles describing machine learning. This subset was created using sentence embeddings and K-means clustering.
## Supported Tasks and Leaderboards
The dataset supports tasks related to text summarization. Particularly, the dataset was created for fine-tuning transformer models for summarization. There are no established leaderboards at this moment.
## Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
An instance in the dataset includes a scientific paper and its summary, both in English.
### Data Fields
article: The full text of the scientific paper.\
abstract: The summary of the paper.
### Data Splits
The dataset is split into:
- training subset: 30,280 articles
- validation subset: 1,196 articles
- test subset: 1,145 articles
## Dataset Creation
### Methods
The subset was created using sentence embeddings from a transformer model, SciBERT. The embeddings were clustered into 6 clusters using the K-means clustering algorithm. The cluster closest to articles strongly related to the machine learning area by cosine similarity was chosen to form this dataset.
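
The selection procedure above can be sketched as follows. This is a minimal illustration, not the curators' actual code: random vectors stand in for the SciBERT embeddings, and the reference embedding for machine-learning articles is likewise synthetic; only the cluster count (6) and the cosine-similarity selection come from the description.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)

# Stand-ins for SciBERT sentence embeddings of the full corpus
# and for an averaged embedding of known machine-learning articles.
corpus_embeddings = rng.normal(size=(1000, 768))
ml_reference = rng.normal(size=(10, 768)).mean(axis=0, keepdims=True)

# Cluster the corpus into 6 groups, as in the dataset description.
kmeans = KMeans(n_clusters=6, n_init=10, random_state=0)
labels = kmeans.fit_predict(corpus_embeddings)

# Keep the cluster whose centroid is closest, by cosine similarity,
# to the machine-learning reference embedding.
sims = cosine_similarity(kmeans.cluster_centers_, ml_reference).ravel()
ml_cluster = int(sims.argmax())

subset_indices = np.where(labels == ml_cluster)[0]
print(f"selected cluster {ml_cluster} with {len(subset_indices)} articles")
```

In the real pipeline, `subset_indices` would select the articles and abstracts that form this dataset.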
### Source Data
The dataset is a subset of the 'Scientific papers' dataset, which includes scientific papers from the ArXiv repository.
### Social Impact
This dataset could help improve the quality of summarization models for machine learning research articles, which in turn can make such content more accessible.
### Discussion of Biases
As the dataset focuses on machine learning articles, it may not be representative of scientific papers in general or other specific domains.
### Other Known Limitations
As the dataset has been selected based on a specific methodology, it may not include all machine learning articles or may inadvertently include non-machine learning articles.
### Dataset Curators
The subset was created as part of a project aimed to build an effective summarization model for Machine Learning articles. |
justinwilloughby/mimarchive-all-MiniLM-L6-v2 | ---
license: mit
---
|
BiMediX/mmlu-professional_medicine-arabic | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 329003
num_examples: 272
download_size: 168257
dataset_size: 329003
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kristmh/flutter_testset | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 5413073
num_examples: 2374
download_size: 1929122
dataset_size: 5413073
---
# Dataset Card for "flutter_testset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SEACrowd/jv_id_tts | ---
tags:
- text-to-speech
language:
- jav
---
# jv_id_tts
This dataset contains high-quality transcribed audio data for Javanese.
The dataset consists of wave files and a TSV file.
The file line_index.tsv contains a filename and the transcription of the audio in that file.
Each filename is prepended with a speaker identification number.
The dataset has been manually quality checked, but there might still be errors.
This dataset was collected by Google in collaboration with Gadjah Mada University in Indonesia.
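
A minimal sketch of reading such an index file, assuming a tab-separated layout of `filename<TAB>transcription` and treating the first underscore-delimited fields of the filename as the speaker identifier; the sample filenames and transcriptions below are illustrative, not taken from the dataset:

```python
import csv
import io

# A made-up two-line index in the documented shape:
# filename <TAB> transcription.
sample = (
    "jvf_00264_0123456789\tsugeng enjing\n"
    "jvm_01932_0987654321\tmatur nuwun\n"
)

entries = []
for filename, transcription in csv.reader(io.StringIO(sample), delimiter="\t"):
    # Assumed convention: the speaker ID is the prefix of the filename.
    speaker = "_".join(filename.split("_")[:2])
    entries.append({"file": filename + ".wav",
                    "speaker": speaker,
                    "text": transcription})

print(entries[0]["speaker"])  # jvf_00264
```

Grouping `entries` by `speaker` then gives per-speaker utterance lists for TTS training.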
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{sodimana18_sltu,
author={Keshan Sodimana and Pasindu {De Silva} and Supheakmungkol Sarin and Oddur Kjartansson and Martin Jansche and Knot Pipatsrisawat and Linne Ha},
title={{A Step-by-Step Process for Building TTS Voices Using Open Source Data and Frameworks for Bangla, Javanese, Khmer, Nepali, Sinhala, and Sundanese}},
year=2018,
booktitle={Proc. 6th Workshop on Spoken Language Technologies for Under-Resourced Languages (SLTU 2018)},
pages={66--70},
doi={10.21437/SLTU.2018-14}
}
```
## License
See https://www.openslr.org/resources/41/LICENSE file for license information. Attribution-ShareAlike 4.0 (CC BY-SA 4.0).
## Homepage
[http://openslr.org/41/](http://openslr.org/41/)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
robbo232323/gutenberg-block-from-next | ---
task_categories:
- text-generation
language:
- pl
pretty_name: Gutenberg Blocks from Next.js
size_categories:
- n<1K
--- |
fightfei/INFO-desc-llama2 | ---
dataset_info:
features:
- name: Subject Code
dtype: string
- name: Subject number
dtype: int64
- name: 'Unnamed: 2'
dtype: string
- name: Hours
dtype: string
splits:
- name: train
num_bytes: 2394.0
num_examples: 36
- name: test
num_bytes: 266.0
num_examples: 4
download_size: 6214
dataset_size: 2660.0
---
# Dataset Card for "INFO-desc-llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/massive_artificial_5pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 812439
num_examples: 11514
download_size: 258136
dataset_size: 812439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ctang/util_eval_llama2_v3 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: more_reasonable
dtype: string
splits:
- name: train
num_bytes: 1076712
num_examples: 4808
download_size: 450619
dataset_size: 1076712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MananSantoki/Vadodara-Info-Converted | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 97472
num_examples: 350
download_size: 38991
dataset_size: 97472
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Vadodara-Info-Converted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BarraHome__Mistroll-7B-v0.1-16bit | ---
pretty_name: Evaluation run of BarraHome/Mistroll-7B-v0.1-16bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BarraHome/Mistroll-7B-v0.1-16bit](https://huggingface.co/BarraHome/Mistroll-7B-v0.1-16bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__Mistroll-7B-v0.1-16bit\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T08:06:00.383239](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Mistroll-7B-v0.1-16bit/blob/main/results_2024-02-21T08-06-00.383239.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032184784518743,\n\
\ \"acc_stderr\": 0.03333730204729809,\n \"acc_norm\": 0.607891645213564,\n\
\ \"acc_norm_stderr\": 0.03401402537730786,\n \"mc1\": 0.5226438188494492,\n\
\ \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n\
\ \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464392,\n\
\ \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.0141696645203031\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6612228639713205,\n\
\ \"acc_stderr\": 0.004723266971563391,\n \"acc_norm\": 0.8481378211511651,\n\
\ \"acc_norm_stderr\": 0.0035815378475817935\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572277,\n \"\
acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572277\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.0251891498947642,\n \
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.0251891498947642\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335842,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335842\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534427,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534427\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5226438188494492,\n\
\ \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n\
\ \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827936\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3957543593631539,\n \
\ \"acc_stderr\": 0.013469823701048815\n }\n}\n```"
repo_url: https://huggingface.co/BarraHome/Mistroll-7B-v0.1-16bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|arc:challenge|25_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|gsm8k|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hellaswag|10_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T08-06-00.383239.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T08-06-00.383239.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- '**/details_harness|winogrande|5_2024-02-21T08-06-00.383239.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T08-06-00.383239.parquet'
- config_name: results
data_files:
- split: 2024_02_21T08_06_00.383239
path:
- results_2024-02-21T08-06-00.383239.parquet
- split: latest
path:
- results_2024-02-21T08-06-00.383239.parquet
---
# Dataset Card for Evaluation run of BarraHome/Mistroll-7B-v0.1-16bit
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarraHome/Mistroll-7B-v0.1-16bit](https://huggingface.co/BarraHome/Mistroll-7B-v0.1-16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarraHome__Mistroll-7B-v0.1-16bit",
"harness_winogrande_5",
split="train")
```
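Each per-run split is named with the run's timestamp (e.g. `2024_02_21T08_06_00.383239`), so the most recent run can also be recovered locally by sorting the timestamped split names. A minimal sketch, using an illustrative split list rather than one fetched from the Hub:

```python
def latest_run(split_names):
    """Return the most recent timestamped split, skipping aliases like 'latest'."""
    stamps = [s for s in split_names if s[:1].isdigit()]
    # The ISO-like naming makes lexicographic order match chronological order.
    return max(stamps)

# Illustrative split names in the format used by this dataset.
splits = ["latest", "2024_02_20T12_00_00.000000", "2024_02_21T08_06_00.383239"]
print(latest_run(splits))  # -> 2024_02_21T08_06_00.383239
```

In practice the `latest` split alias already points at this run, so the helper is only needed when comparing multiple historical runs.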
## Latest results
These are the [latest results from run 2024-02-21T08:06:00.383239](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Mistroll-7B-v0.1-16bit/blob/main/results_2024-02-21T08-06-00.383239.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6032184784518743,
"acc_stderr": 0.03333730204729809,
"acc_norm": 0.607891645213564,
"acc_norm_stderr": 0.03401402537730786,
"mc1": 0.5226438188494492,
"mc1_stderr": 0.01748554225848964,
"mc2": 0.6766513448639357,
"mc2_stderr": 0.015264009667659464
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464392,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.0141696645203031
},
"harness|hellaswag|10": {
"acc": 0.6612228639713205,
"acc_stderr": 0.004723266971563391,
"acc_norm": 0.8481378211511651,
"acc_norm_stderr": 0.0035815378475817935
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.0251891498947642,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.0251891498947642
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335842,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335842
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.01594930879023364,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.01594930879023364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534427,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534427
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5226438188494492,
"mc1_stderr": 0.01748554225848964,
"mc2": 0.6766513448639357,
"mc2_stderr": 0.015264009667659464
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827936
},
"harness|gsm8k|5": {
"acc": 0.3957543593631539,
"acc_stderr": 0.013469823701048815
}
}
```
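As a minimal sketch of how the per-task scores above can be aggregated: the report is a flat dict keyed by task name, so a mean accuracy is a few lines of Python. The subset of values below is copied from the JSON above; the overall mean on the leaderboard is computed over all tasks, not just these three.

```python
# Per-task results shaped like the harness JSON above (subset only;
# values copied verbatim from the report).
results = {
    "harness|hendrycksTest-high_school_physics|5": {"acc": 0.3443708609271523},
    "harness|hendrycksTest-high_school_psychology|5": {"acc": 0.8018348623853211},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8245614035087719},
}

# Keep only entries that report an "acc" metric, then take the unweighted mean.
accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```

The same pattern works for `acc_norm`, `mc2`, or any other metric key present in the dict.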
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1 | ---
pretty_name: Evaluation run of hoskinson-center/proofGPT-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hoskinson-center/proofGPT-v0.1](https://huggingface.co/hoskinson-center/proofGPT-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T05:08:13.781124](https://huggingface.co/datasets/open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1/blob/main/results_2023-10-24T05-08-13.781124.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0028313758389261743,\n\
\ \"em_stderr\": 0.0005441551135493826,\n \"f1\": 0.02285234899328862,\n\
\ \"f1_stderr\": 0.0009680329295901147,\n \"acc\": 0.25254955650911065,\n\
\ \"acc_stderr\": 0.007405053088899723\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135493826,\n\
\ \"f1\": 0.02285234899328862,\n \"f1_stderr\": 0.0009680329295901147\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225419\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.014051956064076903\n\
\ }\n}\n```"
repo_url: https://huggingface.co/hoskinson-center/proofGPT-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|arc:challenge|25_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T05_08_13.781124
path:
- '**/details_harness|drop|3_2023-10-24T05-08-13.781124.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T05-08-13.781124.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T05_08_13.781124
path:
- '**/details_harness|gsm8k|5_2023-10-24T05-08-13.781124.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T05-08-13.781124.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hellaswag|10_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T06-51-58.783827.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T06-51-58.783827.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T05_08_13.781124
path:
- '**/details_harness|winogrande|5_2023-10-24T05-08-13.781124.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T05-08-13.781124.parquet'
- config_name: results
data_files:
- split: 2023_10_04T06_51_58.783827
path:
- results_2023-10-04T06-51-58.783827.parquet
- split: 2023_10_24T05_08_13.781124
path:
- results_2023-10-24T05-08-13.781124.parquet
- split: latest
path:
- results_2023-10-24T05-08-13.781124.parquet
---
# Dataset Card for Evaluation run of hoskinson-center/proofGPT-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hoskinson-center/proofGPT-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hoskinson-center/proofGPT-v0.1](https://huggingface.co/hoskinson-center/proofGPT-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1",
"harness_winogrande_5",
split="train")
```
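Each run is also stored under a split named after its timestamp (see the YAML configs above), with `latest` as an alias for the most recent one. If you need to resolve the newest run yourself, a minimal sketch (relying on the fact that the zero-padded timestamp format sorts chronologically) could look like this; `latest_split` is a hypothetical helper, not part of any library:

```python
def latest_split(split_names):
    """Return the most recent timestamp-named split.

    Split names follow the pattern used in this repo, e.g.
    '2023_10_04T06_51_58.783827'; lexicographic order matches
    chronological order for this zero-padded format.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped)


splits = [
    "2023_10_04T06_51_58.783827",
    "2023_10_24T05_08_13.781124",
    "latest",
]
print(latest_split(splits))  # 2023_10_24T05_08_13.781124
```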
## Latest results
These are the [latest results from run 2023-10-24T05:08:13.781124](https://huggingface.co/datasets/open-llm-leaderboard/details_hoskinson-center__proofGPT-v0.1/blob/main/results_2023-10-24T05-08-13.781124.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493826,
"f1": 0.02285234899328862,
"f1_stderr": 0.0009680329295901147,
"acc": 0.25254955650911065,
"acc_stderr": 0.007405053088899723
},
"harness|drop|3": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493826,
"f1": 0.02285234899328862,
"f1_stderr": 0.0009680329295901147
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225419
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076903
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Zayt/extracted-vi-wiki-20230820 | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 492261222.63576823
num_examples: 395032
download_size: 663150112
dataset_size: 492261222.63576823
---
# Dataset Card for "extracted-vi-wiki-20230820"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Edopangui/promo_parquet | ---
license: apache-2.0
---
|
CyberHarem/lavie_lapisrelights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Lavie (Lapis Re:LiGHTs)
This is the dataset of Lavie (Lapis Re:LiGHTs), containing 154 images and their tags.
The core tags of this character are `blonde_hair, twintails, bow, hair_bow, green_eyes, bangs, red_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 154 | 93.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lavie_lapisrelights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 154 | 77.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lavie_lapisrelights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 306 | 148.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lavie_lapisrelights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 154 | 93.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lavie_lapisrelights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 306 | 173.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lavie_lapisrelights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lavie_lapisrelights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, closed_mouth, sailor_collar, school_uniform, solo, blush, hair_between_eyes, portrait, smile, anime_coloring, collarbone, looking_at_viewer, outdoors, upper_body, white_shirt |
| 1 | 20 |  |  |  |  |  | 1girl, solo, breasts, puffy_short_sleeves, upper_body, closed_mouth, blush, dress, hair_between_eyes, neck_ribbon, blue_shirt, looking_at_viewer, blurry_background, indoors, outdoors, striped_bow |
| 2 | 21 |  |  |  |  |  | 1girl, solo, fingerless_gloves, black_gloves, outdoors, smile, breasts, open_mouth, sleeveless_shirt, upper_body, blush, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | sailor_collar | school_uniform | solo | blush | hair_between_eyes | portrait | smile | anime_coloring | collarbone | looking_at_viewer | outdoors | upper_body | white_shirt | breasts | puffy_short_sleeves | dress | neck_ribbon | blue_shirt | blurry_background | indoors | striped_bow | fingerless_gloves | black_gloves | open_mouth | sleeveless_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------------|:-----------------|:-------|:--------|:--------------------|:-----------|:--------|:-----------------|:-------------|:--------------------|:-----------|:-------------|:--------------|:----------|:----------------------|:--------|:--------------|:-------------|:--------------------|:----------|:--------------|:--------------------|:---------------|:-------------|:-------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 20 |  |  |  |  |  | X | X | | | X | X | X | | | | | X | X | X | | X | X | X | X | X | X | X | X | | | | |
| 2 | 21 |  |  |  |  |  | X | | | | X | X | | | X | | | X | X | X | | X | | | | | | | | X | X | X | X |
|
bigIR/ar_cov19 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- ar
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- other
task_ids: []
paperswithcode_id: arcov-19
pretty_name: ArCOV19
tags:
- data-mining
dataset_info:
config_name: ar_cov19
features:
- name: tweetID
dtype: string
splits:
- name: train
num_bytes: 72223634
num_examples: 3140158
download_size: 23678407
dataset_size: 72223634
---
# Dataset Card for ArCOV19
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://gitlab.com/bigirqu/ArCOV-19
- **Paper:** [ArCOV-19: The First Arabic COVID-19 Twitter Dataset with Propagation Networks](https://arxiv.org/abs/2004.05861)
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [Fatima Haouari](mailto:200159617@qu.edu.qa)
### Dataset Summary
ArCOV-19 is an Arabic COVID-19 Twitter dataset that covers the period from the 27th of January till the 5th of May 2021.
ArCOV-19 is the first publicly-available Arabic Twitter dataset covering the COVID-19 pandemic that includes about 3.2M
tweets alongside the propagation networks of the most-popular subset of them (i.e., the most-retweeted and most-liked).
The propagation networks include both retweets and conversational threads (i.e., threads of replies).
ArCOV-19 is designed to enable research under several domains including natural language processing, information
retrieval, and social computing, among others. Preliminary analysis shows that ArCOV-19 captures rising discussions
associated with the first reported cases of the disease as they appeared in the Arab world. In addition to the source
tweets and the propagation networks, we also release the search queries and the language-independent crawler used to
collect the tweets to encourage the curation of similar datasets.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Arabic
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
`tweetID`: the Twitter-assigned ID of the tweet object.
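Since the dataset only distributes tweet IDs, the tweet objects have to be rehydrated through the Twitter API. A minimal sketch of preparing the IDs for hydration is below; `id_batches` is a hypothetical helper, and the batch size of 100 assumes the per-request limit of Twitter's `statuses/lookup` endpoint:

```python
def id_batches(tweet_ids, batch_size=100):
    """Group tweet IDs into batches of at most `batch_size` items,
    matching the per-request limit of Twitter's statuses/lookup
    endpoint, so each batch can be hydrated with one API call."""
    for i in range(0, len(tweet_ids), batch_size):
        yield tweet_ids[i:i + batch_size]


# Example: 250 IDs -> 3 requests (100 + 100 + 50).
ids = [str(n) for n in range(250)]
batches = list(id_batches(ids))
print(len(batches))  # 3
```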
### Data Splits
[More Information Needed]
## Dataset Creation
The dataset collection approach is presented in the following paper: [ArCOV-19: The First Arabic COVID-19 Twitter Dataset with Propagation Networks](https://arxiv.org/abs/2004.05861)
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
No annotation was provided with the dataset.
#### Annotation process
No annotation was provided with the dataset.
#### Who are the annotators?
No annotation was provided with the dataset.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
**Team:** [bigIR](https://sites.google.com/view/bigir) from Qatar University ([@bigIR_group](https://twitter.com/bigIR_group))
- [Fatima Haouari](mailto:200159617@qu.edu.qa)
- [Maram Hasanain](mailto:maram.hasanain@qu.edu.qa)
- [Reem Suwaileh](mailto:rs081123@qu.edu.qa)
- [Dr. Tamer Elsayed](mailto:telsayed@qu.edu.qa)
### Licensing Information
[More Information Needed]
### Citation Information
```
@article{haouari2020arcov19,
title={ArCOV-19: The First Arabic COVID-19 Twitter Dataset with Propagation Networks},
author={Fatima Haouari and Maram Hasanain and Reem Suwaileh and Tamer Elsayed},
year={2021},
eprint={2004.05861},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@Fatima-Haouari](https://github.com/Fatima-Haouari) for adding this dataset. |
CyberHarem/elaice_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of elaice/イレース (Fire Emblem)
This is the dataset of elaice/イレース (Fire Emblem), containing 138 images and their tags.
The core tags of this character are `purple_hair, long_hair, purple_eyes, twintails, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 138 | 111.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 138 | 77.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 258 | 141.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 138 | 103.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 258 | 175.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elaice_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, hetero, multiple_penises, solo_focus, mosaic_censoring, gangbang, nipples, vaginal, blush, cum_in_pussy, medium_breasts, 2boys, 3boys, circlet, facial, fellatio, handjob, testicles |
| 1 | 5 |  |  |  |  |  | 1girl, nipples, solo, medium_breasts, open_mouth, blush, completely_nude, navel, artist_name, circlet, food, hair_flower, large_breasts, looking_at_viewer, pussy, signature, simple_background, sitting |
| 2 | 8 |  |  |  |  |  | 1girl, cape, circlet, skirt, solo, low-tied_long_hair, book, simple_background, sitting, white_background |
| 3 | 7 |  |  |  |  |  | 1girl, circlet, full_body, short_sleeves, simple_background, solo, bangs, capelet, hood_down, low_twintails, white_footwear, miniskirt, shiny_hair, white_background, belt_pouch, closed_mouth, purple_skirt, holding_book, jewelry, knee_boots, looking_at_viewer, magic, open_book |
| 4 | 9 |  |  |  |  |  | 1girl, alternate_costume, solo, candy, halloween_costume, holding, long_sleeves, circlet, dress, simple_background, cape, open_mouth, white_background, white_pantyhose, boots, eating, looking_at_viewer, purple_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | multiple_penises | solo_focus | mosaic_censoring | gangbang | nipples | vaginal | blush | cum_in_pussy | medium_breasts | 2boys | 3boys | circlet | facial | fellatio | handjob | testicles | solo | open_mouth | completely_nude | navel | artist_name | food | hair_flower | large_breasts | looking_at_viewer | pussy | signature | simple_background | sitting | cape | skirt | low-tied_long_hair | book | white_background | full_body | short_sleeves | bangs | capelet | hood_down | low_twintails | white_footwear | miniskirt | shiny_hair | belt_pouch | closed_mouth | purple_skirt | holding_book | jewelry | knee_boots | magic | open_book | alternate_costume | candy | halloween_costume | holding | long_sleeves | dress | white_pantyhose | boots | eating | purple_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------------|:-------------|:-------------------|:-----------|:----------|:----------|:--------|:---------------|:-----------------|:--------|:--------|:----------|:---------|:-----------|:----------|:------------|:-------|:-------------|:------------------|:--------|:--------------|:-------|:--------------|:----------------|:--------------------|:--------|:------------|:--------------------|:----------|:-------|:--------|:---------------------|:-------|:-------------------|:------------|:----------------|:--------|:----------|:------------|:----------------|:-----------------|:------------|:-------------|:-------------|:---------------|:---------------|:---------------|:----------|:-------------|:--------|:------------|:--------------------|:--------|:--------------------|:----------|:---------------|:--------|:------------------|:--------|:---------|:----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | | | X | | X | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | | | | | | | | | X | | | | | X | | | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | | | | | | | | | | X | | | | | X | X | | | | | | | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
alvarobartt/Anthropic_HH_Golden_Extended | ---
tags:
- not-for-all-audiences
dataset_info:
features:
- name: prompt_id
dtype: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 128690951
num_examples: 85074
- name: test
num_bytes: 7201288
num_examples: 4624
download_size: 44628148
dataset_size: 135892239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: apache-2.0
task_categories:
- conversational
language:
- en
size_categories:
- 10K<n<100K
--- |
Atipico1/NQ-30k | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 99959994.19960193
num_examples: 30000
- name: test
num_bytes: 12097860
num_examples: 3610
download_size: 66498219
dataset_size: 112057854.19960193
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
todi1/jjasmr | ---
license: openrail
---
|
mariosasko/glue | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- acceptability-classification
- natural-language-inference
- semantic-similarity-scoring
- sentiment-classification
- text-scoring
paperswithcode_id: glue
pretty_name: GLUE (General Language Understanding Evaluation benchmark)
train-eval-index:
- config: cola
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: sst2
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: mrpc
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: qqp
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question1: text1
question2: text2
label: target
- config: stsb
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: mnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation_matched
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_mismatched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_matched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: qnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question: text1
sentence: text2
label: target
- config: rte
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: wnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
configs:
- ax
- cola
- mnli
- mnli_matched
- mnli_mismatched
- mrpc
- qnli
- qqp
- rte
- sst2
- stsb
- wnli
tags:
- qa-nli
- coreference-nli
- paraphrase-identification
dataset_info:
- config_name: cola
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
0: unacceptable
1: acceptable
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 61049
num_examples: 1063
- name: train
num_bytes: 489149
num_examples: 8551
- name: validation
num_bytes: 60850
num_examples: 1043
download_size: 376971
dataset_size: 611048
- config_name: sst2
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
0: negative
1: positive
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 217556
num_examples: 1821
- name: train
num_bytes: 4715283
num_examples: 67349
- name: validation
num_bytes: 106692
num_examples: 872
download_size: 7439277
dataset_size: 5039531
- config_name: mrpc
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
0: not_equivalent
1: equivalent
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 443498
num_examples: 1725
- name: train
num_bytes: 946146
num_examples: 3668
- name: validation
num_bytes: 106142
num_examples: 408
download_size: 1494541
dataset_size: 1495786
- config_name: qqp
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype:
class_label:
names:
0: not_duplicate
1: duplicate
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 50901116
num_examples: 363846
- name: validation
num_bytes: 5653794
num_examples: 40430
- name: test
num_bytes: 55171431
num_examples: 390965
download_size: 41696084
dataset_size: 111726341
- config_name: stsb
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float32
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 170847
num_examples: 1379
- name: train
num_bytes: 758394
num_examples: 5749
- name: validation
num_bytes: 217012
num_examples: 1500
download_size: 802872
dataset_size: 1146253
- config_name: mnli
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
0: entailment
1: neutral
2: contradiction
- name: idx
dtype: int32
splits:
- name: test_matched
num_bytes: 1854787
num_examples: 9796
- name: test_mismatched
num_bytes: 1956866
num_examples: 9847
- name: train
num_bytes: 74865118
num_examples: 392702
- name: validation_matched
num_bytes: 1839926
num_examples: 9815
- name: validation_mismatched
num_bytes: 1955384
num_examples: 9832
download_size: 312783507
dataset_size: 82472081
- config_name: mnli_mismatched
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
0: entailment
1: neutral
2: contradiction
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 1956866
num_examples: 9847
- name: validation
num_bytes: 1955384
num_examples: 9832
download_size: 312783507
dataset_size: 3912250
- config_name: mnli_matched
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
0: entailment
1: neutral
2: contradiction
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 1854787
num_examples: 9796
- name: validation
num_bytes: 1839926
num_examples: 9815
download_size: 312783507
dataset_size: 3694713
- config_name: qnli
features:
- name: question
dtype: string
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
0: entailment
1: not_entailment
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 1376516
num_examples: 5463
- name: train
num_bytes: 25677924
num_examples: 104743
- name: validation
num_bytes: 1371727
num_examples: 5463
download_size: 10627589
dataset_size: 28426167
- config_name: rte
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
0: entailment
1: not_entailment
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 975936
num_examples: 3000
- name: train
num_bytes: 848888
num_examples: 2490
- name: validation
num_bytes: 90911
num_examples: 277
download_size: 697150
dataset_size: 1915735
- config_name: wnli
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
0: not_entailment
1: entailment
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 37992
num_examples: 146
- name: train
num_bytes: 107517
num_examples: 635
- name: validation
num_bytes: 12215
num_examples: 71
download_size: 28999
dataset_size: 157724
- config_name: ax
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
0: entailment
1: neutral
2: contradiction
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 238392
num_examples: 1104
download_size: 222257
dataset_size: 238392
---
# Dataset Card for GLUE
## Table of Contents
- [Dataset Card for GLUE](#dataset-card-for-glue)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [ax](#ax)
- [cola](#cola)
- [mnli](#mnli)
- [mnli_matched](#mnli_matched)
- [mnli_mismatched](#mnli_mismatched)
- [mrpc](#mrpc)
- [qnli](#qnli)
- [qqp](#qqp)
- [rte](#rte)
- [sst2](#sst2)
- [stsb](#stsb)
- [wnli](#wnli)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [ax](#ax-1)
- [cola](#cola-1)
- [mnli](#mnli-1)
- [mnli_matched](#mnli_matched-1)
- [mnli_mismatched](#mnli_mismatched-1)
- [mrpc](#mrpc-1)
- [qnli](#qnli-1)
- [qqp](#qqp-1)
- [rte](#rte-1)
- [sst2](#sst2-1)
- [stsb](#stsb-1)
- [wnli](#wnli-1)
- [Data Fields](#data-fields)
- [ax](#ax-2)
- [cola](#cola-2)
- [mnli](#mnli-2)
- [mnli_matched](#mnli_matched-2)
- [mnli_mismatched](#mnli_mismatched-2)
- [mrpc](#mrpc-2)
- [qnli](#qnli-2)
- [qqp](#qqp-2)
- [rte](#rte-2)
- [sst2](#sst2-2)
- [stsb](#stsb-2)
- [wnli](#wnli-2)
- [Data Splits](#data-splits)
- [ax](#ax-3)
- [cola](#cola-3)
- [mnli](#mnli-3)
- [mnli_matched](#mnli_matched-3)
- [mnli_mismatched](#mnli_mismatched-3)
- [mrpc](#mrpc-3)
- [qnli](#qnli-3)
- [qqp](#qqp-3)
- [rte](#rte-3)
- [sst2](#sst2-3)
- [stsb](#stsb-3)
- [wnli](#wnli-3)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://nyu-mll.github.io/CoLA/](https://nyu-mll.github.io/CoLA/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 955.33 MB
- **Size of the generated dataset:** 229.68 MB
- **Total amount of disk used:** 1185.01 MB
### Dataset Summary
GLUE, the General Language Understanding Evaluation benchmark (https://gluebenchmark.com/) is a collection of resources for training, evaluating, and analyzing natural language understanding systems.
### Supported Tasks and Leaderboards
The leaderboard for the GLUE benchmark can be found [at this address](https://gluebenchmark.com/). It comprises the following tasks:
#### ax
A manually-curated evaluation dataset for fine-grained analysis of system performance on a broad range of linguistic phenomena. This dataset evaluates sentence understanding through Natural Language Inference (NLI) problems. Use a model trained on MultiNLI to produce predictions for this dataset.
#### cola
The Corpus of Linguistic Acceptability consists of English acceptability judgments drawn from books and journal articles on linguistic theory. Each example is a sequence of words annotated with whether it is a grammatical English sentence.
#### mnli
The Multi-Genre Natural Language Inference Corpus is a crowdsourced collection of sentence pairs with textual entailment annotations. Given a premise sentence and a hypothesis sentence, the task is to predict whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral). The premise sentences are gathered from ten different sources, including transcribed speech, fiction, and government reports. The authors of the benchmark use the standard test set, for which they obtained private labels from the RTE authors, and evaluate on both the matched (in-domain) and mismatched (cross-domain) sections. They also use and recommend the SNLI corpus as 550k examples of auxiliary training data.
#### mnli_matched
The matched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mnli_mismatched
The mismatched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mrpc
The Microsoft Research Paraphrase Corpus (Dolan & Brockett, 2005) is a corpus of sentence pairs automatically extracted from online news sources, with human annotations for whether the sentences in the pair are semantically equivalent.
#### qnli
The Stanford Question Answering Dataset is a question-answering dataset consisting of question-paragraph pairs, where one of the sentences in the paragraph (drawn from Wikipedia) contains the answer to the corresponding question (written by an annotator). The authors of the benchmark convert the task into sentence pair classification by forming a pair between each question and each sentence in the corresponding context, and filtering out pairs with low lexical overlap between the question and the context sentence. The task is to determine whether the context sentence contains the answer to the question. This modified version of the original task removes the requirement that the model select the exact answer, but also removes the simplifying assumptions that the answer is always present in the input and that lexical overlap is a reliable cue.
#### qqp
The Quora Question Pairs2 dataset is a collection of question pairs from the community question-answering website Quora. The task is to determine whether a pair of questions are semantically equivalent.
#### rte
The Recognizing Textual Entailment (RTE) datasets come from a series of annual textual entailment challenges. The authors of the benchmark combined the data from RTE1 (Dagan et al., 2006), RTE2 (Bar Haim et al., 2006), RTE3 (Giampiccolo et al., 2007), and RTE5 (Bentivogli et al., 2009). Examples are constructed based on news and Wikipedia text. The authors of the benchmark convert all datasets to a two-class split, where for three-class datasets they collapse neutral and contradiction into not entailment, for consistency.
#### sst2
The Stanford Sentiment Treebank consists of sentences from movie reviews and human annotations of their sentiment. The task is to predict the sentiment of a given sentence. It uses the two-way (positive/negative) class split, with only sentence-level labels.
#### stsb
The Semantic Textual Similarity Benchmark (Cer et al., 2017) is a collection of sentence pairs drawn from news headlines, video and image captions, and natural language inference data. Each pair is human-annotated with a similarity score from 1 to 5.
#### wnli
The Winograd Schema Challenge (Levesque et al., 2011) is a reading comprehension task in which a system must read a sentence with a pronoun and select the referent of that pronoun from a list of choices. The examples are manually constructed to foil simple statistical methods: each one is contingent on contextual information provided by a single word or phrase in the sentence. To convert the problem into sentence pair classification, the authors of the benchmark construct sentence pairs by replacing the ambiguous pronoun with each possible referent. The task is to predict if the sentence with the pronoun substituted is entailed by the original sentence. They use a small evaluation set consisting of new examples derived from fiction books that was shared privately by the authors of the original corpus. While the included training set is balanced between two classes, the test set is imbalanced between them (65% not entailment). Also, due to a data quirk, the development set is adversarial: hypotheses are sometimes shared between training and development examples, so if a model memorizes the training examples, it will predict the wrong label on the corresponding development set example. As with QNLI, each example is evaluated separately, so there is not a systematic correspondence between a model's score on this task and its score on the unconverted original task. The authors of the benchmark call the converted dataset WNLI (Winograd NLI).
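The per-task input columns listed in the `train-eval-index` metadata at the top of this card can be mirrored in a small helper when building model inputs. The sketch below is illustrative only: the dictionary and the `model_inputs` helper are not part of any official GLUE API, and the column names are simply copied from the YAML header above.

```python
# Input column(s) for each GLUE config, mirrored from the
# train-eval-index metadata in this card's YAML header.
GLUE_INPUT_COLUMNS = {
    "cola": ("sentence",),
    "sst2": ("sentence",),
    "mrpc": ("sentence1", "sentence2"),
    "qqp": ("question1", "question2"),
    "stsb": ("sentence1", "sentence2"),
    "mnli": ("premise", "hypothesis"),
    "qnli": ("question", "sentence"),
    "rte": ("sentence1", "sentence2"),
    "wnli": ("sentence1", "sentence2"),
    "ax": ("premise", "hypothesis"),
}


def model_inputs(config: str, example: dict) -> tuple:
    """Return the text input(s) of one example in the order a
    single-sentence or sentence-pair model expects them."""
    return tuple(example[col] for col in GLUE_INPUT_COLUMNS[config])
```

For example, `model_inputs("mrpc", example)` yields the `(sentence1, sentence2)` pair, while single-sentence tasks such as CoLA yield a one-element tuple.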
### Languages
The language data in GLUE is in English (BCP-47 `en`).
## Dataset Structure
### Data Instances
#### ax
- **Size of downloaded dataset files:** 0.21 MB
- **Size of the generated dataset:** 0.23 MB
- **Total amount of disk used:** 0.44 MB
An example of 'test' looks as follows.
```
{
"premise": "The cat sat on the mat.",
"hypothesis": "The cat did not sit on the mat.",
"label": -1,
  "idx": 0
}
```
#### cola
- **Size of downloaded dataset files:** 0.36 MB
- **Size of the generated dataset:** 0.58 MB
- **Total amount of disk used:** 0.94 MB
An example of 'train' looks as follows.
```
{
"sentence": "Our friends won't buy this analysis, let alone the next one we propose.",
"label": 1,
  "idx": 0
}
```
#### mnli
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 78.65 MB
- **Total amount of disk used:** 376.95 MB
An example of 'train' looks as follows.
```
{
"premise": "Conceptually cream skimming has two basic dimensions - product and geography.",
"hypothesis": "Product and geography are what make cream skimming work.",
"label": 1,
"idx": 0
}
```
#### mnli_matched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.52 MB
- **Total amount of disk used:** 301.82 MB
An example of 'test' looks as follows.
```
{
"premise": "Hierbas, ans seco, ans dulce, and frigola are just a few names worth keeping a look-out for.",
"hypothesis": "Hierbas is a name worth looking out for.",
"label": -1,
"idx": 0
}
```
#### mnli_mismatched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.73 MB
- **Total amount of disk used:** 302.02 MB
An example of 'test' looks as follows.
```
{
"premise": "What have you decided, what are you going to do?",
  "hypothesis": "So what's your decision?",
"label": -1,
"idx": 0
}
```
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
#### ax
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### cola
- `sentence`: a `string` feature.
- `label`: a classification label, with possible values including `unacceptable` (0), `acceptable` (1).
- `idx`: a `int32` feature.
#### mnli
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mnli_matched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mnli_mismatched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
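The classification labels above are stored as integers, and (as the example instances earlier in this card show) unlabeled test examples carry the sentinel `-1`. A minimal, dependency-free sketch of mapping those integers back to names — the `LABEL_NAMES` table is copied from the Data Fields section above, while the helper name and the `"unlabeled"` return value are illustrative choices, not part of the dataset itself:

```python
# Label names per config, as listed in the Data Fields section above.
LABEL_NAMES = {
    "cola": ["unacceptable", "acceptable"],
    "sst2": ["negative", "positive"],
    "mrpc": ["not_equivalent", "equivalent"],
    "qqp": ["not_duplicate", "duplicate"],
    "mnli": ["entailment", "neutral", "contradiction"],
    "qnli": ["entailment", "not_entailment"],
    "rte": ["entailment", "not_entailment"],
    "wnli": ["not_entailment", "entailment"],
    "ax": ["entailment", "neutral", "contradiction"],
}


def label_to_name(config: str, label: int) -> str:
    """Map an integer label to its name; test-set examples with no
    public label use -1 (see the example instances above)."""
    if label == -1:
        return "unlabeled"
    return LABEL_NAMES[config][label]
```

Note that WNLI inverts the usual order (`0 = not_entailment`), so looking labels up per config rather than assuming one shared mapping avoids a subtle bug.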
### Data Splits
#### ax
| |test|
|---|---:|
|ax |1104|
#### cola
| |train|validation|test|
|----|----:|---------:|---:|
|cola| 8551| 1043|1063|
#### mnli
| |train |validation_matched|validation_mismatched|test_matched|test_mismatched|
|----|-----:|-----------------:|--------------------:|-----------:|--------------:|
|mnli|392702| 9815| 9832| 9796| 9847|
#### mnli_matched
| |validation|test|
|------------|---------:|---:|
|mnli_matched| 9815|9796|
#### mnli_mismatched
| |validation|test|
|---------------|---------:|---:|
|mnli_mismatched| 9832|9847|
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{warstadt2018neural,
title={Neural Network Acceptability Judgments},
author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},
journal={arXiv preprint arXiv:1805.12471},
year={2018}
}
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
Note that each GLUE dataset has its own citation. Please see the source to see
the correct citation for each contained dataset.
```
### Contributions
Thanks to [@patpizio](https://github.com/patpizio), [@jeswan](https://github.com/jeswan), [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham) for adding this dataset. |
dotdotdidi/fine_tuning_datraset_4_openai | ---
license: apache-2.0
---
|
CyberHarem/momoi_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of momoi/才羽モモイ/桃井 (Blue Archive)
This is the dataset of momoi/才羽モモイ/桃井 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `blonde_hair, animal_ears, fake_animal_ears, animal_ear_headphones, headphones, short_hair, bow, halo, cat_ear_headphones, hair_bow, red_bow, pink_eyes, pink_halo, tail, cat_tail, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 690.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momoi_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 599.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momoi_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1284 | 1.24 GiB | [Download](https://huggingface.co/datasets/CyberHarem/momoi_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/momoi_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
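If you prefer not to depend on waifuc, an extracted IMG+TXT package can also be walked with the standard library alone: each image sits next to a same-named `.txt` file holding its comma-separated tags. The sketch below assumes that pairing convention from the package description above; the function name is illustrative.

```python
from pathlib import Path


def iter_image_tag_pairs(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package,
    reading tags from the .txt file sitting next to each image."""
    for path in sorted(Path(dataset_dir).rglob("*")):
        if path.suffix.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt = path.with_suffix(".txt")
        if txt.exists():
            tags = [t.strip() for t in txt.read_text(encoding="utf-8").split(",")]
            yield path, tags
```

This is handy for feeding the pairs straight into a training pipeline without any extra dependencies.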
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, collared_shirt, long_sleeves, pleated_skirt, solo, white_shirt, black_skirt, blue_necktie, looking_at_viewer, white_jacket, blush, open_mouth, white_background, simple_background, smile, wide_sleeves, black_thighhighs, suspender_skirt |
| 1 | 18 |  |  |  |  |  | 2girls, black_skirt, blue_necktie, collared_shirt, long_sleeves, sisters, white_jacket, white_shirt, open_mouth, twins, black_thighhighs, blush, pleated_skirt, looking_at_viewer, smile, suspenders, simple_background, solo_focus, white_background, wide_sleeves |
| 2 | 6 |  |  |  |  |  | 2girls, collared_shirt, long_sleeves, looking_at_viewer, sisters, upper_body, white_jacket, white_shirt, 1girl, blue_necktie, open_mouth, solo, twins, blush, simple_background, smile, white_background, wide_sleeves |
| 3 | 8 |  |  |  |  |  | 1girl, blue_necktie, blush, collared_shirt, solo, white_shirt, open_mouth, simple_background, white_background, jacket, looking_at_viewer, portrait, upper_body, smile |
| 4 | 6 |  |  |  |  |  | 1girl, blue_necktie, collared_shirt, long_sleeves, solo, upper_body, white_jacket, white_shirt, blush, simple_background, white_background, closed_mouth, smile, looking_at_viewer, wide_sleeves |
| 5 | 6 |  |  |  |  |  | 1girl, blush, long_sleeves, open_mouth, red_scarf, smile, solo, white_jacket, upper_body, looking_at_viewer |
| 6 | 8 |  |  |  |  |  | 2girls, black_dress, frilled_apron, maid_apron, maid_headdress, official_alternate_costume, sisters, white_apron, looking_at_viewer, open_mouth, smile, solo_focus, twins, blush, simple_background, twintails, white_pantyhose, frilled_dress, neck_ribbon, white_background, blue_bow, fake_tail, puffy_long_sleeves |
| 7 | 7 |  |  |  |  |  | 1girl, black_dress, black_footwear, frilled_apron, maid_apron, maid_headdress, official_alternate_costume, shoes, simple_background, solo, white_apron, white_background, white_pantyhose, blush, full_body, looking_at_viewer, open_mouth, smile, twintails, blue_bow, frilled_dress, standing, puffy_long_sleeves, holding_broom |
| 8 | 9 |  |  |  |  |  | blush, loli, nipples, 1boy, hetero, penis, twins, collarbone, open_mouth, solo_focus, 2girls, small_breasts, smile, completely_nude, pussy, sisters, 1girl, bar_censor, black_thighhighs, looking_at_viewer, mosaic_censoring, sex, vaginal |
| 9 | 5 |  |  |  |  |  | 2girls, blush, sisters, twins, looking_at_viewer, navel, open_mouth, small_breasts, smile, solo_focus, white_bikini, collarbone, flat_chest, micro_bikini, stomach, fake_tail, loli, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collared_shirt | long_sleeves | pleated_skirt | solo | white_shirt | black_skirt | blue_necktie | looking_at_viewer | white_jacket | blush | open_mouth | white_background | simple_background | smile | wide_sleeves | black_thighhighs | suspender_skirt | 2girls | sisters | twins | suspenders | solo_focus | upper_body | jacket | portrait | closed_mouth | red_scarf | black_dress | frilled_apron | maid_apron | maid_headdress | official_alternate_costume | white_apron | twintails | white_pantyhose | frilled_dress | neck_ribbon | blue_bow | fake_tail | puffy_long_sleeves | black_footwear | shoes | full_body | standing | holding_broom | loli | nipples | 1boy | hetero | penis | collarbone | small_breasts | completely_nude | pussy | bar_censor | mosaic_censoring | sex | vaginal | navel | white_bikini | flat_chest | micro_bikini | stomach |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:----------------|:-------|:--------------|:--------------|:---------------|:--------------------|:---------------|:--------|:-------------|:-------------------|:--------------------|:--------|:---------------|:-------------------|:------------------|:---------|:----------|:--------|:-------------|:-------------|:-------------|:---------|:-----------|:---------------|:------------|:--------------|:----------------|:-------------|:-----------------|:-----------------------------|:--------------|:------------|:------------------|:----------------|:--------------|:-----------|:------------|:---------------------|:-----------------|:--------|:------------|:-----------|:----------------|:-------|:----------|:-------|:---------|:--------|:-------------|:----------------|:------------------|:--------|:-------------|:-------------------|:------|:----------|:--------|:---------------|:-------------|:---------------|:----------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | | X | X | | X | X | | X | X | X | X | X | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | | X | X | X | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | X | | | | X | X | X | X | | | X | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | | | | | | | | | X | | X | X | X | X | X | | | | X | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | | | | | | | X | | X | X | | | X | | X | | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 9 | 5 |  |  |  |  |  | | | | | | | | | X | | X | X | X | X | X | | | | X | X | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | X | X | | | | | | | X | X | X | X | X |
|
korexyz/pokemon-blip-captions-embeddings | ---
dataset_info:
features:
- name: text_embedding
sequence: float32
- name: image_embedding
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 15800344
num_examples: 833
download_size: 16604690
dataset_size: 15800344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tsuinzues/rainbowdash | ---
license: openrail
---
|
open-llm-leaderboard/details_argilla__notus-7b-v1 | ---
pretty_name: Evaluation run of argilla/notus-7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [argilla/notus-7b-v1](https://huggingface.co/argilla/notus-7b-v1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_argilla__notus-7b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T17:15:53.519887](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notus-7b-v1/blob/main/results_2023-12-04T17-15-53.519887.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6284345225205253,\n\
\ \"acc_stderr\": 0.03266688541458245,\n \"acc_norm\": 0.6343199967908271,\n\
\ \"acc_norm_stderr\": 0.03333546965424883,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5434993224846835,\n\
\ \"mc2_stderr\": 0.01537768281733017\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.01425856388051378,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.01397545412275656\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6533559051981677,\n\
\ \"acc_stderr\": 0.004749286071559562,\n \"acc_norm\": 0.8483369846644094,\n\
\ \"acc_norm_stderr\": 0.0035796087435066106\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386424,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386424\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094764,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094764\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091122,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091122\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.0250093137900697,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.0250093137900697\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n\
\ \"acc_stderr\": 0.016643307372315872,\n \"acc_norm\": 0.45139664804469276,\n\
\ \"acc_norm_stderr\": 0.016643307372315872\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.012667701919603662,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.012667701919603662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724504,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.0279626776047689,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.0279626776047689\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5434993224846835,\n\
\ \"mc2_stderr\": 0.01537768281733017\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597207\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3457164518574678,\n \
\ \"acc_stderr\": 0.013100422990441573\n }\n}\n```"
repo_url: https://huggingface.co/argilla/notus-7b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|arc:challenge|25_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|drop|3_2023-11-29T22-16-51.521321.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-29T22-16-51.521321.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|gsm8k|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hellaswag|10_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-29T22-16-51.521321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-15-53.519887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-15-53.519887.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- '**/details_harness|winogrande|5_2023-11-29T22-16-51.521321.parquet'
- split: 2023_12_04T17_15_53.519887
path:
- '**/details_harness|winogrande|5_2023-12-04T17-15-53.519887.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T17-15-53.519887.parquet'
- config_name: results
data_files:
- split: 2023_11_29T22_16_51.521321
path:
- results_2023-11-29T22-16-51.521321.parquet
- split: 2023_12_04T17_15_53.519887
path:
- results_2023-12-04T17-15-53.519887.parquet
- split: latest
path:
- results_2023-12-04T17-15-53.519887.parquet
---
# Dataset Card for Evaluation run of argilla/notus-7b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/argilla/notus-7b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [argilla/notus-7b-v1](https://huggingface.co/argilla/notus-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__notus-7b-v1",
"harness_winogrande_5",
	split="latest")
```
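The timestamped split names above are derived from the run timestamps by replacing dashes with underscores. A small helper (an illustrative sketch, not part of the dataset tooling) can build the split name when you want the details of a specific run rather than the latest one:

```python
def split_name_from_timestamp(ts: str) -> str:
    # Split names replace every dash in the run timestamp with an underscore,
    # e.g. "2023-12-04T17-15-53.519887" -> "2023_12_04T17_15_53.519887".
    return ts.replace("-", "_")


# Usage with the loading snippet above (requires network access):
# data = load_dataset("open-llm-leaderboard/details_argilla__notus-7b-v1",
#                     "harness_winogrande_5",
#                     split=split_name_from_timestamp("2023-12-04T17-15-53.519887"))
print(split_name_from_timestamp("2023-12-04T17-15-53.519887"))
```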
## Latest results
These are the [latest results from run 2023-12-04T17:15:53.519887](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notus-7b-v1/blob/main/results_2023-12-04T17-15-53.519887.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6284345225205253,
"acc_stderr": 0.03266688541458245,
"acc_norm": 0.6343199967908271,
"acc_norm_stderr": 0.03333546965424883,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5434993224846835,
"mc2_stderr": 0.01537768281733017
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.01397545412275656
},
"harness|hellaswag|10": {
"acc": 0.6533559051981677,
"acc_stderr": 0.004749286071559562,
"acc_norm": 0.8483369846644094,
"acc_norm_stderr": 0.0035796087435066106
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386424,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386424
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094764,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094764
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091122,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091122
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.0250093137900697,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.0250093137900697
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.016643307372315872,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.016643307372315872
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.012667701919603662,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.012667701919603662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724504,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.0279626776047689,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.0279626776047689
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5434993224846835,
"mc2_stderr": 0.01537768281733017
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597207
},
"harness|gsm8k|5": {
"acc": 0.3457164518574678,
"acc_stderr": 0.013100422990441573
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HanxuHU/mmmu_th | ---
dataset_info:
- config_name: Accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1608508.0
num_examples: 30
download_size: 1539948
dataset_size: 1608508.0
- config_name: Agriculture
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 119222088.0
num_examples: 30
download_size: 119225355
dataset_size: 119222088.0
- config_name: Architecture_and_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 730957.0
num_examples: 30
download_size: 730963
dataset_size: 730957.0
- config_name: Art
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 29938565.0
num_examples: 30
download_size: 29941296
dataset_size: 29938565.0
- config_name: Art_Theory
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 33483477.0
num_examples: 30
download_size: 29784730
dataset_size: 33483477.0
- config_name: Basic_Medical_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 4129143.0
num_examples: 30
download_size: 4136065
dataset_size: 4129143.0
- config_name: Biology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8499901.0
num_examples: 30
download_size: 8497039
dataset_size: 8499901.0
- config_name: Chemistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1525165.0
num_examples: 30
download_size: 1524411
dataset_size: 1525165.0
- config_name: Clinical_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 10891316.0
num_examples: 30
download_size: 10889174
dataset_size: 10891316.0
- config_name: Computer_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 2079428.0
num_examples: 30
download_size: 2081465
dataset_size: 2079428.0
- config_name: Design
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 17925837.0
num_examples: 30
download_size: 16228899
dataset_size: 17925837.0
- config_name: Diagnostics_and_Laboratory_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 37109598.0
num_examples: 30
download_size: 37090620
dataset_size: 37109598.0
- config_name: Economics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1494866.0
num_examples: 30
download_size: 1428595
dataset_size: 1494866.0
- config_name: Electronics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 644756.0
num_examples: 30
download_size: 645350
dataset_size: 644756.0
- config_name: Energy_and_Power
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1652711.0
num_examples: 30
download_size: 1651654
dataset_size: 1652711.0
- config_name: Finance
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1083786.0
num_examples: 30
download_size: 1010588
dataset_size: 1083786.0
- config_name: Geography
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 6676465.0
num_examples: 30
download_size: 6678327
dataset_size: 6676465.0
- config_name: History
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8824664.0
num_examples: 30
download_size: 8432451
dataset_size: 8824664.0
- config_name: Literature
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 14245622.0
num_examples: 30
download_size: 14248581
dataset_size: 14245622.0
- config_name: Manage
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 3297865.0
num_examples: 30
download_size: 3146540
dataset_size: 3297865.0
- config_name: Marketing
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1482390.0
num_examples: 30
download_size: 1365050
dataset_size: 1482390.0
- config_name: Materials
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 2311813.0
num_examples: 30
download_size: 2312357
dataset_size: 2311813.0
- config_name: Math
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1450496.0
num_examples: 30
download_size: 1451285
dataset_size: 1450496.0
- config_name: Mechanical_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 882721.0
num_examples: 30
download_size: 881837
dataset_size: 882721.0
- config_name: Music
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 9361424.0
num_examples: 30
download_size: 9364576
dataset_size: 9361424.0
- config_name: Pharmacy
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1662710.0
num_examples: 30
download_size: 1553400
dataset_size: 1662710.0
- config_name: Physics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1121984.0
num_examples: 30
download_size: 1120650
dataset_size: 1121984.0
- config_name: Psychology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 4436175.0
num_examples: 30
download_size: 4317851
dataset_size: 4436175.0
- config_name: Public_Health
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1525148.0
num_examples: 30
download_size: 1514003
dataset_size: 1525148.0
- config_name: Sociology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 18458525.0
num_examples: 30
download_size: 18461351
dataset_size: 18458525.0
configs:
- config_name: Accounting
data_files:
- split: validation
path: Accounting/validation-*
- config_name: Agriculture
data_files:
- split: validation
path: Agriculture/validation-*
- config_name: Architecture_and_Engineering
data_files:
- split: validation
path: Architecture_and_Engineering/validation-*
- config_name: Art
data_files:
- split: validation
path: Art/validation-*
- config_name: Art_Theory
data_files:
- split: validation
path: Art_Theory/validation-*
- config_name: Basic_Medical_Science
data_files:
- split: validation
path: Basic_Medical_Science/validation-*
- config_name: Biology
data_files:
- split: validation
path: Biology/validation-*
- config_name: Chemistry
data_files:
- split: validation
path: Chemistry/validation-*
- config_name: Clinical_Medicine
data_files:
- split: validation
path: Clinical_Medicine/validation-*
- config_name: Computer_Science
data_files:
- split: validation
path: Computer_Science/validation-*
- config_name: Design
data_files:
- split: validation
path: Design/validation-*
- config_name: Diagnostics_and_Laboratory_Medicine
data_files:
- split: validation
path: Diagnostics_and_Laboratory_Medicine/validation-*
- config_name: Economics
data_files:
- split: validation
path: Economics/validation-*
- config_name: Electronics
data_files:
- split: validation
path: Electronics/validation-*
- config_name: Energy_and_Power
data_files:
- split: validation
path: Energy_and_Power/validation-*
- config_name: Finance
data_files:
- split: validation
path: Finance/validation-*
- config_name: Geography
data_files:
- split: validation
path: Geography/validation-*
- config_name: History
data_files:
- split: validation
path: History/validation-*
- config_name: Literature
data_files:
- split: validation
path: Literature/validation-*
- config_name: Manage
data_files:
- split: validation
path: Manage/validation-*
- config_name: Marketing
data_files:
- split: validation
path: Marketing/validation-*
- config_name: Materials
data_files:
- split: validation
path: Materials/validation-*
- config_name: Math
data_files:
- split: validation
path: Math/validation-*
- config_name: Mechanical_Engineering
data_files:
- split: validation
path: Mechanical_Engineering/validation-*
- config_name: Music
data_files:
- split: validation
path: Music/validation-*
- config_name: Pharmacy
data_files:
- split: validation
path: Pharmacy/validation-*
- config_name: Physics
data_files:
- split: validation
path: Physics/validation-*
- config_name: Psychology
data_files:
- split: validation
path: Psychology/validation-*
- config_name: Public_Health
data_files:
- split: validation
path: Public_Health/validation-*
- config_name: Sociology
data_files:
- split: validation
path: Sociology/validation-*
---
|
Codec-SUPERB/esc50_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 16081006
num_examples: 2000
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 16081006
num_examples: 2000
- name: academicodec_hifi_24k_320d
num_bytes: 24081006
num_examples: 2000
- name: audiodec_24k_320d
num_bytes: 51441006
num_examples: 2000
- name: dac_16k
num_bytes: 98065006
num_examples: 2000
- name: dac_24k
num_bytes: 272177006
num_examples: 2000
- name: dac_44k
num_bytes: 83065006
num_examples: 2000
- name: encodec_24k
num_bytes: 12097006
num_examples: 2000
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 128817006
num_examples: 2000
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 128817006
num_examples: 2000
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 128305006
num_examples: 2000
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 64305006
num_examples: 2000
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 128305006
num_examples: 2000
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 64305006
num_examples: 2000
- name: speech_tokenizer_16k
num_bytes: 32113006
num_examples: 2000
download_size: 180540016
dataset_size: 1248055090
---
# Dataset Card for "esc50_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
helloelwin/gsm8k | ---
dataset_info:
- config_name: train_strong
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3963732.33614345
num_examples: 3737
download_size: 2296622
dataset_size: 3963732.33614345
- config_name: train_strong_1
features:
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: answer
dtype: string
- name: acc
dtype: float64
splits:
- name: train
num_bytes: 1547749.720096334
num_examples: 1868
download_size: 876300
dataset_size: 1547749.720096334
- config_name: train_strong_2
features:
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: answer
dtype: string
- name: acc
dtype: float64
splits:
- name: train
num_bytes: 1548578.279903666
num_examples: 1869
download_size: 878854
dataset_size: 1548578.279903666
- config_name: train_weak
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3962671.66385655
num_examples: 3736
download_size: 2318553
dataset_size: 3962671.66385655
configs:
- config_name: train_strong
data_files:
- split: train
path: train_strong/train-*
- config_name: train_strong_1
data_files:
- split: train
path: train_strong_1/train-*
- config_name: train_strong_2
data_files:
- split: train
path: train_strong_2/train-*
- config_name: train_weak
data_files:
- split: train
path: train_weak/train-*
---
|
0x22almostEvil/reasoning_bg_oa | ---
license: apache-2.0
task_categories:
- question-answering
language:
- bg
tags:
- QnA
- reasoning
size_categories:
- 1K<n<10K
---
# Dataset Card for Bulgarian QnA reasoning with ~2.7K entries.
### Dataset Summary
Contains a Parquet file with a list of instructions and answers.
Each row consists of:
* INSTRUCTION
* RESPONSE
* SOURCE (reasoning_bg)
* METADATA (JSON with language, url, id)
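A minimal sketch of how a row might be consumed, assuming the METADATA column is stored as a JSON string; the row values below are illustrative placeholders, not actual entries from the dataset:

```python
import json

# Hypothetical row mirroring the schema described above
# (INSTRUCTION / RESPONSE / SOURCE / METADATA).
row = {
    "INSTRUCTION": "Коя е столицата на България?",
    "RESPONSE": "София",
    "SOURCE": "reasoning_bg",
    "METADATA": json.dumps({"language": "bg", "url": "https://example.org", "id": "0"}),
}

# The METADATA field is decoded per-row to recover language, url, and id.
meta = json.loads(row["METADATA"])
print(meta["language"])  # -> bg
```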
### Original Dataset is available here:
* https://huggingface.co/datasets/reasoning_bg |