datasetId | card |
|---|---|
irds/beir_nfcorpus_train | ---
pretty_name: '`beir/nfcorpus/train`'
viewer: false
source_datasets: ['irds/beir_nfcorpus']
task_categories:
- text-retrieval
---
# Dataset Card for `beir/nfcorpus/train`
The `beir/nfcorpus/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/nfcorpus/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=2,590
- `qrels` (relevance assessments); count=110,575
- For `docs`, use [`irds/beir_nfcorpus`](https://huggingface.co/datasets/irds/beir_nfcorpus)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/beir_nfcorpus_train', 'queries')
for record in queries:
    record # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/beir_nfcorpus_train', 'qrels')
for record in qrels:
    record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
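To pair the qrels with their query and document text, one option is to build in-memory lookups from the companion `docs` subset. A minimal sketch, following the record-iteration pattern of the snippet above and assuming the field names it shows (NFCorpus is small enough that the lookups fit in memory):
```python
from datasets import load_dataset

# Docs live in the companion dataset noted above.
docs = load_dataset('irds/beir_nfcorpus', 'docs')
doc_lookup = {record['doc_id']: record for record in docs}

queries = load_dataset('irds/beir_nfcorpus_train', 'queries')
query_lookup = {record['query_id']: record['text'] for record in queries}

# Pair each relevance judgment with its query text and document record.
qrels = load_dataset('irds/beir_nfcorpus_train', 'qrels')
for qrel in qrels:
    query_text = query_lookup[qrel['query_id']]
    doc = doc_lookup.get(qrel['doc_id'])
```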
## Citation Information
```
@inproceedings{Boteva2016Nfcorpus,
  title = "A Full-Text Learning to Rank Dataset for Medical Information Retrieval",
  author = "Vera Boteva and Demian Gholipour and Artem Sokolov and Stefan Riezler",
  booktitle = "Proceedings of the European Conference on Information Retrieval ({ECIR})",
  location = "Padova, Italy",
  publisher = "Springer",
  year = 2016
}
@article{Thakur2021Beir,
  title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
  author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
  journal = "arXiv preprint arXiv:2104.08663",
  month = "4",
  year = "2021",
  url = "https://arxiv.org/abs/2104.08663",
}
```
|
open-llm-leaderboard/details_saishf__West-Hermes-7B | ---
pretty_name: Evaluation run of saishf/West-Hermes-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [saishf/West-Hermes-7B](https://huggingface.co/saishf/West-Hermes-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__West-Hermes-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T21:42:28.166161](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Hermes-7B/blob/main/results_2024-02-09T21-42-28.166161.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find the results in the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538092062988495,\n\
\ \"acc_stderr\": 0.032084902797116045,\n \"acc_norm\": 0.6533253003223362,\n\
\ \"acc_norm_stderr\": 0.032757174003025594,\n \"mc1\": 0.4969400244798042,\n\
\ \"mc1_stderr\": 0.017503173260960618,\n \"mc2\": 0.6425676288822494,\n\
\ \"mc2_stderr\": 0.015503970191592676\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6911262798634812,\n \"acc_stderr\": 0.013501770929344003,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7055367456681936,\n\
\ \"acc_stderr\": 0.004548695749620959,\n \"acc_norm\": 0.8760207130053774,\n\
\ \"acc_norm_stderr\": 0.0032888439778712606\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933712,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933712\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853034,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853034\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n\
\ \"acc_stderr\": 0.016295332328155814,\n \"acc_norm\": 0.3877094972067039,\n\
\ \"acc_norm_stderr\": 0.016295332328155814\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4969400244798042,\n\
\ \"mc1_stderr\": 0.017503173260960618,\n \"mc2\": 0.6425676288822494,\n\
\ \"mc2_stderr\": 0.015503970191592676\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272951\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \
\ \"acc_stderr\": 0.012791037227336034\n }\n}\n```"
repo_url: https://huggingface.co/saishf/West-Hermes-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-42-28.166161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-42-28.166161.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- '**/details_harness|winogrande|5_2024-02-09T21-42-28.166161.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T21-42-28.166161.parquet'
- config_name: results
data_files:
- split: 2024_02_09T21_42_28.166161
path:
- results_2024-02-09T21-42-28.166161.parquet
- split: latest
path:
- results_2024-02-09T21-42-28.166161.parquet
---
# Dataset Card for Evaluation run of saishf/West-Hermes-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/West-Hermes-7B](https://huggingface.co/saishf/West-Hermes-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__West-Hermes-7B",
"harness_winogrande_5",
split="train")
```
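Beyond a single config, you can discover everything the repo exposes before loading anything. A minimal sketch using the standard `datasets` helpers (the config and split names follow the lists in the metadata above):
```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_saishf__West-Hermes-7B"

# Enumerate the per-task configs (harness_arc_challenge_25, ..., results).
configs = get_dataset_config_names(repo)

# Each config has one split per run timestamp, plus a "latest" alias.
splits = get_dataset_split_names(repo, "harness_gsm8k_5")

# Load the aggregated metrics of the most recent run.
results = load_dataset(repo, "results", split="latest")
```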
## Latest results
These are the [latest results from run 2024-02-09T21:42:28.166161](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__West-Hermes-7B/blob/main/results_2024-02-09T21-42-28.166161.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find the results in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6538092062988495,
"acc_stderr": 0.032084902797116045,
"acc_norm": 0.6533253003223362,
"acc_norm_stderr": 0.032757174003025594,
"mc1": 0.4969400244798042,
"mc1_stderr": 0.017503173260960618,
"mc2": 0.6425676288822494,
"mc2_stderr": 0.015503970191592676
},
"harness|arc:challenge|25": {
"acc": 0.6911262798634812,
"acc_stderr": 0.013501770929344003,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7055367456681936,
"acc_stderr": 0.004548695749620959,
"acc_norm": 0.8760207130053774,
"acc_norm_stderr": 0.0032888439778712606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933712,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933712
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853034,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853034
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.016295332328155814,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.016295332328155814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4969400244798042,
"mc1_stderr": 0.017503173260960618,
"mc2": 0.6425676288822494,
"mc2_stderr": 0.015503970191592676
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272951
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.012791037227336034
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mda/zipcode | ---
license: gpl-3.0
---
|
coastalcph/fair-rationales | ---
license: mit
language:
- en
annotations_creators:
- crowdsourced
source_datasets:
- extended
task_categories:
- text-classification
task_ids:
- sentiment-classification
- open-domain-qa
tags:
- bias
- fairness
- rationale
- demographic
pretty_name: FairRationales
---
# Dataset Card for "FairRationales"
## Dataset Summary
We present a new collection of annotations for a subset of CoS-E [[1]](#1), DynaSent [[2]](#2), and SST [[3]](#3)/Zuco [[4]](#4) with demographics-augmented annotations, balanced across age and ethnicity.
We asked participants to choose a label and then provide supporting evidence (rationales) based on the input sentence for their answer.
Existing rationale datasets are typically constructed by giving annotators 'gold standard' labels,
and having them provide rationales for these labels.
Instead, we let annotators provide rationales for labels they choose themselves. This lets them engage
in the decision process, but it also acknowledges
that annotators with different backgrounds may disagree on classification decisions. Explaining other
people’s choices is error-prone [[5]](#5), and we do not want to bias the rationale
annotations by providing labels that align better
with the intuitions of some demographics than with
those of others.
Our annotators are balanced across age and ethnicity for six demographic groups, defined by
ethnicity {Black/African American, White/Caucasian, Latino/Hispanic} and age {Old, Young}.
Therefore, we can refer to our groups as their cross-product: **{BO, BY, WO, WY, LO, LY}**.
## Dataset Details
### DynaSent
We re-annotate N=480 instances
six times (for six demographic groups), comprising
240 instances labeled as positive, and 240 instances
labeled as negative in the DynaSent Round 2 **test**
set (see [[2]](#2)). This amounts to 2,880
annotations in total.
To annotate rationales, we formulate the task as
marking 'supporting evidence' for the label, following how the task is defined by [[6]](#6). Specifically, we ask annotators to mark
all the words, in the sentence, they think shows
evidence for their chosen label.
#### Our annotations
| label | count |
|---|---|
| negative | 1555 |
| positive | 1435 |
| no sentiment | 470 |
| total | 3460 |
Note that all the data is uploaded under a single 'train' split (see [Uses](#uses) for further details).
### SST2
We re-annotate N=263 instances six
times (for six demographic groups), which are all
the positive and negative instances from the Zuco*
dataset of Hollenstein et al. (2018), comprising a
**mixture of train, validation and test** set instances
from SST-2, *which should be removed from the original SST
data before training any model*.
These 263 reannotated instances do not contain any instances originally marked as `neutral` (or not conveying sentiment) because rationale annotation for neutral instances is ill-defined. Yet,
we still allow annotators to evaluate a sentence as
neutral, since we do not want to force our annotators to provide rationales for positive and negative
sentiment that they do not see.
*The Zuco data contains eye-tracking data for 400 instances from SST. By annotating some of these with rationales,
we add an extra layer of information for future research.
#### Our annotations
| label | count |
|---|---|
| positive | 1027 |
| negative | 900 |
| no sentiment | 163 |
| total | 2090 |
Note that all the data is uploaded under a single 'train' split (see [Uses](#uses) for further details).
### CoS-E
We use the simplified version of CoS-E released by [[6]](#6).
We re-annotate N=500 instances from
the CoS-E **test** set six times (for six demographic groups)
and ask annotators to first select the answer to
the question that they find most correct and sensible, and then mark the words that justify that answer.
Following [[7]](#7), we specify the
rationale task with a wording that should guide
annotators to make short, precise rationale annotations:
‘For each word in the question, if you
think that removing it will decrease your
confidence toward your chosen label,
please mark it.’
#### Our annotations
Total: 3760
Note that all the data is uploaded under a single 'train' split (see [Uses](#uses) for further details).
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/terne/Being_Right_for_Whose_Right_Reasons
- **Paper:** [Being Right for Whose Right Reasons?](https://aclanthology.org/2023.acl-long.59/)
## Uses <a id="uses"></a>
<!-- Address questions around how the dataset is intended to be used. -->
In our paper, we present a collection of three
existing datasets (SST2, DynaSent and Cos-E) with demographics-augmented annotations to enable profiling of models, i.e., quantifying their alignment (or agreement) with rationales provided
by different socio-demographic groups. Such profiling enables us to ask whose right reasons models are being right for and fosters future research on performance equality/robustness.
For each dataset, we provide the data under a single **'train'** split, owing to the current limitation that a dataset cannot be uploaded with only a *'test'* split.
Note, however, that the original intended use of this collection of datasets was to **test** the quality and alignment of post-hoc explainability methods.
If you split the data differently, please state your splits clearly to ease reproducibility of your work.
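As a toy illustration of such profiling, a model's thresholded token attributions can be compared against a group's `rationale_binary` mask with token-level F1. This is only a sketch of the general idea under assumed inputs, not the paper's exact metric or implementation.

```python
from sklearn.metrics import f1_score

# Toy example: one sentence, binary token masks.
# 'human_rationale' stands in for a 'rationale_binary' row; 'model_rationale'
# is a hypothetical thresholded attribution vector from some explainer.
human_rationale = [0, 1, 1, 0, 0, 1]
model_rationale = [0, 1, 0, 0, 1, 1]

alignment = f1_score(human_rationale, model_rationale)
print(f"token-level F1 alignment: {alignment:.2f}")
```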
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
| Variable | Description |
| --- | --- |
| QID | The ID of the question (i.e. the annotation element/sentence) in the Qualtrics survey. Every second question asked for the classification, and the alternating questions asked for the rationale of that classification to be marked. The two questions and answers for the same sentence are merged into one row, so the QID looks as if every second ID is skipped. |
| text_id | A numerical ID given to each unique text/sentence for easy sorting before comparing annotations across groups. |
| sentence | The text/sentence that is annotated, in its original formatting. |
| label | The (new) label given by the respective annotator/participant from Prolific. |
| label_index | The numerical format of the (new) label. |
| original_label | The label from the original dataset (Cose/Dynasent/SST). |
| rationale | The tokens marked as rationales by our annotators. |
| rationale_index | The indices of the tokens marked as rationales. In the processed files the indices start at 0. However, in the unprocessed files ("_all.csv", "_before_exclussions.csv") they start at 1.|
| rationale_binary | A binary version of the rationales where a token marked as part of the rationale = 1 and tokens not marked = 0. |
| age | The reported age of the annotator/participant (i.e. their survey response). This may be different from the age-interval the participant was recruited by (see recruitment_age). |
| recruitment_age | The age interval specified for the Prolific job to recruit the participant by. A mismatch between this and the participant's reported age, when asked in our survey, may mean a number of things, such as: Prolific's information is wrong or outdated; the participant made a mistake when answering the question; the participant was inattentive. |
| ethnicity | The reported ethnicity of the annotator/participant. This may be different from the ethnicity the participant was recruited by (see recruitment_ethnicity). |
| recruitment_ethnicity | The ethnicity specified for the Prolific job to recruit the participant by. Sometimes there is a mismatch between the information Prolific has on participants (which we use for recruitment) and what the participants report when asked again in the survey/task. This seems especially prevalent with some ethnicities, likely because participants may in reality identify with more than one ethnic group. |
| gender | The reported gender of the annotator/participant. |
| english_proficiency | The reported English-speaking ability (proxy for English proficiency) of the annotator/participant. Options were "Not well", "Well" or "Very well". |
| attentioncheck | All participants were given a simple attention check question at the very end of the Qualtrics survey (i.e. after annotation) which was either PASSED or FAILED. Participants who failed the check were still paid for their work, but their response should be excluded from the analysis. |
| group_id | An id describing the socio-demographic subgroup a participant belongs to and was recruited by. |
| originaldata_id | The id given to the text/sentence in the original dataset. In the case of SST data, this refers to ids within the Zuco dataset – a subset of SST which was used in our study.|
| annotator_ID | Anonymised annotator ID to enable analyses such as annotator (dis)agreement. |
| sst2_id | The processed SST annotations contain an extra column with the index of the text in the SST-2 dataset. -1 means that we were unable to match the text to an instance in SST-2. |
| sst2_split | The processed SST annotations contain an extra column referring to the set in which the instance appears within SST-2. Some instances are part of the train set and should therefore be removed before training a model on SST-2 and testing on our annotations. |
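For orientation, a minimal loading sketch is shown below. The configuration name used here (`sst2`) is an assumption for illustration; check the repository for the actual configuration names.

```python
from collections import Counter

from datasets import load_dataset

# 'sst2' is a hypothetical configuration name; see the repository for the real ones.
ds = load_dataset("coastalcph/fair-rationales", "sst2")["train"]  # single 'train' split

# keep only annotators who passed the attention check
ds = ds.filter(lambda ex: ex["attentioncheck"] == "PASSED")

# inspect how annotations are distributed across the six demographic subgroups
print(Counter(ds["group_id"]))
```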
## Dataset Creation
### Curation Rationale
Terne Sasha Thorn Jakobsen, Laura Cabello, Anders Søgaard. Being Right for Whose Right Reasons?
In the Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers).
#### Annotation process
We refer to our [paper](https://aclanthology.org/2023.acl-long.59/) for further details on the data (Section 3), and specifically on the Annotation Process (Section 3.1) and Annotator Population (Section 3.2).
#### Who are the annotators?
Annotators were recruited via Prolific and consented to the use of their responses and demographic information for research purposes.
The annotation tasks were conducted through Qualtrics surveys. The exact surveys can be found [here](https://github.com/terne/Being_Right_for_Whose_Right_Reasons/tree/main/data/qualtrics_survey_exports).
## References
<a id="1">[1]</a>
Nazneen Fatema Rajani, Bryan McCann, Caiming Xiong, and Richard Socher. 2019. Explain Yourself! Leveraging Language Models for Commonsense Reasoning. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4932–4942, Florence, Italy. Association for Computational Linguistics.
<a id="2">[2]</a>
Christopher Potts, Zhengxuan Wu, Atticus Geiger, and Douwe Kiela. 2021. DynaSent: A Dynamic Benchmark for Sentiment Analysis. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 2388–2404, Online. Association for Computational Linguistics.
<a id="3">[3]</a>
Richard Socher, Alex Perelygin, Jean Wu, Jason Chuang, Christopher D. Manning, Andrew Ng, and Christopher Potts. 2013. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. In Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, pages 1631–1642, Seattle, Washington, USA. Association for Computational Linguistics.
<a id="4">[4]</a>
Nora Hollenstein, Jonathan Rotsztejn, Marius Troendle, Andreas Pedroni, Ce Zhang, and Nicolas Langer. 2018. Zuco, a simultaneous eeg and eye-tracking resource for natural sentence reading. Scientific Data.
<a id="5">[5]</a>
Kate Barasz and Tami Kim. 2022. Choice perception: Making sense (and nonsense) of others’ decisions. Current opinion in psychology, 43:176–181.
<a id="6">[6]</a>
Jay DeYoung, Sarthak Jain, Nazneen Fatema Rajani, Eric Lehman, Caiming Xiong, Richard Socher, and Byron C. Wallace. 2019. Eraser: A benchmark to evaluate rationalized nlp models.
<a id="7">[7]</a>
Cheng-Han Chiang and Hung-yi Lee. 2022. Reexamining human annotations for interpretable nlp.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```bibtex
@inproceedings{thorn-jakobsen-etal-2023-right,
title = "Being Right for Whose Right Reasons?",
author = "Thorn Jakobsen, Terne Sasha and
Cabello, Laura and
S{\o}gaard, Anders",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.59",
doi = "10.18653/v1/2023.acl-long.59",
pages = "1033--1054",
abstract = "Explainability methods are used to benchmark the extent to which model predictions align with human rationales i.e., are {`}right for the right reasons{'}. Previous work has failed to acknowledge, however, that what counts as a rationale is sometimes subjective. This paper presents what we think is a first of its kind, a collection of human rationale annotations augmented with the annotators demographic information. We cover three datasets spanning sentiment analysis and common-sense reasoning, and six demographic groups (balanced across age and ethnicity). Such data enables us to ask both what demographics our predictions align with and whose reasoning patterns our models{'} rationales align with. We find systematic inter-group annotator disagreement and show how 16 Transformer-based models align better with rationales provided by certain demographic groups: We find that models are biased towards aligning best with older and/or white annotators. We zoom in on the effects of model size and model distillation, finding {--}contrary to our expectations{--} negative correlations between model size and rationale agreement as well as no evidence that either model size or model distillation improves fairness.",
}
```
## Dataset Card Contact
Thanks to [@lautel](https://github.com/lautel) for adding this dataset. |
joey234/mmlu-clinical_knowledge-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 61861
num_examples: 265
download_size: 40225
dataset_size: 61861
---
# Dataset Card for "mmlu-clinical_knowledge-neg"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hachiouji_naoto_donttoywithmemissnagatoro | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hachiouji Naoto/八王子 (Don't Toy With Me, Miss Nagatoro)
This is the dataset of Hachiouji Naoto/八王子 (Don't Toy With Me, Miss Nagatoro), containing 798 images and their tags.
The core tags of this character are `brown_hair, short_hair, glasses, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 798 | 639.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hachiouji_naoto_donttoywithmemissnagatoro/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 798 | 639.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hachiouji_naoto_donttoywithmemissnagatoro/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1611 | 1.10 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hachiouji_naoto_donttoywithmemissnagatoro/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hachiouji_naoto_donttoywithmemissnagatoro',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
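The IMG+TXT packages can also be used without waifuc. Below is a minimal sketch for `dataset-1200.zip`; the assumption that each image sits next to a same-named `.txt` tag file follows the IMG+TXT description above, so verify the archive layout after extraction.

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download and extract the 1200px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/hachiouji_naoto_donttoywithmemissnagatoro',
    repo_type='dataset',
    filename='dataset-1200.zip',
)
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each tag file with its same-named image (image extensions are an assumption)
for txt_path in glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True):
    base, _ = os.path.splitext(txt_path)
    images = [base + ext for ext in ('.png', '.jpg', '.webp') if os.path.exists(base + ext)]
    with open(txt_path, encoding='utf-8') as f:
        tags = f.read().strip()
    print(images, tags)
```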
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1boy, collared_shirt, male_focus, solo, white_shirt, black-framed_eyewear, sweatdrop, portrait, closed_mouth, blush, indoors |
| 1 | 5 |  |  |  |  |  | 1boy, male_focus, open_mouth, parody, solo, sweatdrop, white_shirt, collared_shirt, looking_at_viewer, portrait, blush |
| 2 | 9 |  |  |  |  |  | 1boy, closed_mouth, male_focus, parody, portrait, solo, blush, over-rim_eyewear, sweatdrop, black-framed_eyewear, wavy_mouth |
| 3 | 8 |  |  |  |  |  | 1boy, collared_shirt, long_sleeves, male_focus, school_bag, school_uniform, white_shirt, black_pants, blush, closed_mouth, orange_sweater, solo_focus, sweatdrop, from_side, profile |
| 4 | 7 |  |  |  |  |  | 1boy, black_pants, collared_shirt, green_apron, male_focus, solo, white_shirt, indoors, long_sleeves, sitting, easel, canvas_(object), tile_floor, full_body, holding, open_mouth, shoes, sweater, white_footwear |
| 5 | 10 |  |  |  |  |  | 1boy, blue_sky, cloud, day, male_focus, outdoors, black-framed_eyewear, blue_jacket, solo, track_jacket, closed_mouth, upper_body, sweatdrop, blush, building, open_mouth |
| 6 | 5 |  |  |  |  |  | 1boy, black_pants, brown_belt, collared_shirt, long_sleeves, male_focus, sitting, solo, white_shirt, indoors, chair, closed_mouth, stool, sweatdrop, book, canvas_(object), easel, holding |
| 7 | 7 |  |  |  |  |  | 1boy, black_pants, indoors, male_focus, solo, white_shirt, brown_belt, long_sleeves, from_behind, sitting, canvas_(object), easel, painting_(object), standing |
| 8 | 6 |  |  |  |  |  | 1boy, brown_pants, collared_shirt, long_sleeves, male_focus, sitting, solo_focus, white_shirt, bench, black-framed_eyewear, closed_mouth, day, outdoors, bag, red_sweater, sketchbook, sweatdrop, blush, holding_pencil, tree |
| 9 | 5 |  |  |  |  |  | 1boy, male_focus, solo, closed_mouth, night, outdoors, sweatdrop, tree, upper_body, green_hoodie, green_jacket, looking_at_viewer, smile, blush |
| 10 | 8 |  |  |  |  |  | 1boy, male_focus, solo, dark, blush, forest, night, tree, outdoors |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | collared_shirt | male_focus | solo | white_shirt | black-framed_eyewear | sweatdrop | portrait | closed_mouth | blush | indoors | open_mouth | parody | looking_at_viewer | over-rim_eyewear | wavy_mouth | long_sleeves | school_bag | school_uniform | black_pants | orange_sweater | solo_focus | from_side | profile | green_apron | sitting | easel | canvas_(object) | tile_floor | full_body | holding | shoes | sweater | white_footwear | blue_sky | cloud | day | outdoors | blue_jacket | track_jacket | upper_body | building | brown_belt | chair | stool | book | from_behind | painting_(object) | standing | brown_pants | bench | bag | red_sweater | sketchbook | holding_pencil | tree | night | green_hoodie | green_jacket | smile | dark | forest |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------|:-----------------|:-------------|:-------|:--------------|:-----------------------|:------------|:-----------|:---------------|:--------|:----------|:-------------|:---------|:--------------------|:-------------------|:-------------|:---------------|:-------------|:-----------------|:--------------|:-----------------|:-------------|:------------|:----------|:--------------|:----------|:--------|:------------------|:-------------|:------------|:----------|:--------|:----------|:-----------------|:-----------|:--------|:------|:-----------|:--------------|:---------------|:-------------|:-----------|:-------------|:--------|:--------|:-------|:--------------|:--------------------|:-----------|:--------------|:--------|:------|:--------------|:-------------|:-----------------|:-------|:--------|:---------------|:---------------|:--------|:-------|:---------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | X | | X | X | X | X | X | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | | X | | X | | X | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | X | | | | | | X | X | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | X | X | | X | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | | | | | | X | | | X | | | | | | X | X | X | | | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | | | | X | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | X | | | | X | X | X | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | | | | | | | X | | | | | X | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | X | X | | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | X | X | X | X | X | | |
| 10 | 8 |  |  |  |  |  | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | | | | X | X |
|
wiki_qa | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
paperswithcode_id: wikiqa
pretty_name: WikiQA
dataset_info:
features:
- name: question_id
dtype: string
- name: question
dtype: string
- name: document_title
dtype: string
- name: answer
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: test
num_bytes: 1333261
num_examples: 6165
- name: validation
num_bytes: 589765
num_examples: 2733
- name: train
num_bytes: 4453862
num_examples: 20360
download_size: 2861208
dataset_size: 6376888
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: train
path: data/train-*
---
# Dataset Card for "wiki_qa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://www.microsoft.com/en-us/download/details.aspx?id=52419](https://www.microsoft.com/en-us/download/details.aspx?id=52419)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [WikiQA: A Challenge Dataset for Open-Domain Question Answering](https://aclanthology.org/D15-1237/)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 7.10 MB
- **Size of the generated dataset:** 6.40 MB
- **Total amount of disk used:** 13.50 MB
### Dataset Summary
Wiki Question Answering corpus from Microsoft.
The WikiQA corpus is a publicly available set of question and sentence pairs, collected and annotated for research on open-domain question answering.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 7.10 MB
- **Size of the generated dataset:** 6.40 MB
- **Total amount of disk used:** 13.50 MB
An example of 'train' looks as follows.
```
{
"answer": "Glacier caves are often called ice caves , but this term is properly used to describe bedrock caves that contain year-round ice.",
"document_title": "Glacier cave",
"label": 0,
"question": "how are glacier caves formed?",
"question_id": "Q1"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `question_id`: a `string` feature.
- `question`: a `string` feature.
- `document_title`: a `string` feature.
- `answer`: a `string` feature.
- `label`: a classification label, with possible values including `0` (0), `1` (1).
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default|20360| 2733|6165|
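A minimal usage sketch with the standard `datasets` API; filtering on `label == 1` keeps only the sentences annotated as correct answers.

```python
from datasets import load_dataset

dataset = load_dataset("wiki_qa")

# keep only (question, answer) pairs annotated as correct (label == 1)
correct_answers = dataset["train"].filter(lambda ex: ex["label"] == 1)
print(correct_answers[0])
```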
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
MICROSOFT RESEARCH DATA LICENSE AGREEMENT
FOR
MICROSOFT RESEARCH WIKIQA CORPUS
These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its
affiliates) and you. Please read them. They apply to the data associated with this license above, which includes
the media on which you received it, if any. The terms also apply to any Microsoft:
- updates,
- supplements,
- Internet-based services, and
- support services
for this data, unless other terms accompany those items. If so, those terms apply.
BY USING THE DATA, YOU ACCEPT THESE TERMS. IF YOU DO NOT ACCEPT THEM, DO NOT USE THE DATA.
If you comply with these license terms, you have the rights below.
1. SCOPE OF LICENSE.
a. You may use, copy, modify, create derivative works, and distribute the Dataset:
i. for research and technology development purposes only. Examples of research and technology
development uses are teaching, academic research, public demonstrations and experimentation;
and
ii. to publish (or present papers or articles) on your results from using such Dataset.
b. The data is licensed, not sold. This agreement only gives you some rights to use the data. Microsoft reserves
all other rights. Unless applicable law gives you more rights despite this limitation, you may use the data only
as expressly permitted in this agreement. In doing so, you must comply with any technical limitations in the
data that only allow you to use it in certain ways.
You may not
- work around any technical limitations in the data;
- reverse engineer, decompile or disassemble the data, except and only to the extent that applicable law
expressly permits, despite this limitation;
- rent, lease or lend the data;
- transfer the data or this agreement to any third party; or
- use the data directly in a commercial product without Microsoft’s permission.
2. DISTRIBUTION REQUIREMENTS:
a. If you distribute the Dataset or any derivative works of the Dataset, you will distribute them under the
same terms and conditions as in this Agreement, and you will not grant other rights to the Dataset or
derivative works that are different from those provided by this Agreement.
b. If you have created derivative works of the Dataset, and distribute such derivative works, you will
cause the modified files to carry prominent notices so that recipients know that they are not receiving
the original Dataset. Such notices must state: (i) that you have changed the Dataset; and (ii) the date
of any changes.
3. DISTRIBUTION RESTRICTIONS. You may not: (a) alter any copyright, trademark or patent notice in the
Dataset; (b) use Microsoft’s trademarks in a way that suggests your derivative works or modifications come from
or are endorsed by Microsoft; (c) include the Dataset in malicious, deceptive or unlawful programs.
4. OWNERSHIP. Microsoft retains all right, title, and interest in and to any Dataset provided to you under this
Agreement. You acquire no interest in the Dataset you may receive under the terms of this Agreement.
5. LICENSE TO MICROSOFT. Microsoft is granted back, without any restrictions or limitations, a non-exclusive,
perpetual, irrevocable, royalty-free, assignable and sub-licensable license, to reproduce, publicly perform or
display, use, modify, post, distribute, make and have made, sell and transfer your modifications to and/or
derivative works of the Dataset, for any purpose.
6. FEEDBACK. If you give feedback about the Dataset to Microsoft, you give to Microsoft, without charge, the right
to use, share and commercialize your feedback in any way and for any purpose. You also give to third parties,
without charge, any patent rights needed for their products, technologies and services to use or interface with
any specific parts of a Microsoft dataset or service that includes the feedback. You will not give feedback that is
subject to a license that requires Microsoft to license its Dataset or documentation to third parties because we
include your feedback in them. These rights survive this Agreement.
7. EXPORT RESTRICTIONS. The Dataset is subject to United States export laws and regulations. You must
comply with all domestic and international export laws and regulations that apply to the Dataset. These laws
include restrictions on destinations, end users and end use. For additional information, see
www.microsoft.com/exporting.
8. ENTIRE AGREEMENT. This Agreement, and the terms for supplements, updates, Internet-based services and
support services that you use, are the entire agreement for the Dataset.
9. SUPPORT SERVICES. Because this data is “as is,” we may not provide support services for it.
10. APPLICABLE LAW.
a. United States. If you acquired the software in the United States, Washington state law governs the
interpretation of this agreement and applies to claims for breach of it, regardless of conflict of laws principles.
The laws of the state where you live govern all other claims, including claims under state consumer protection
laws, unfair competition laws, and in tort.
b. Outside the United States. If you acquired the software in any other country, the laws of that country
apply.
11. LEGAL EFFECT. This Agreement describes certain legal rights. You may have other rights under the laws of your
country. You may also have rights with respect to the party from whom you acquired the Dataset. This
Agreement does not change your rights under the laws of your country if the laws of your country do not permit
it to do so.
12. DISCLAIMER OF WARRANTY. The Dataset is licensed “as-is.” You bear the risk of using it. Microsoft gives no
express warranties, guarantees or conditions. You may have additional consumer rights or statutory guarantees
under your local laws which this agreement cannot change. To the extent permitted under your local laws,
Microsoft excludes the implied warranties of merchantability, fitness for a particular purpose and non-
infringement.
13. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM
MICROSOFT AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP TO U.S. $5.00. YOU CANNOT RECOVER ANY
OTHER DAMAGES, INCLUDING CONSEQUENTIAL, LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL
DAMAGES.
This limitation applies to
- anything related to the software, services, content (including code) on third party Internet sites, or third party
programs; and
- claims for breach of contract, breach of warranty, guarantee or condition, strict liability, negligence, or other
tort to the extent permitted by applicable law.
It also applies even if Microsoft knew or should have known about the possibility of the damages. The above
limitation or exclusion may not apply to you because your country may not allow the exclusion or limitation of
incidental, consequential or other damages.
### Citation Information
```
@inproceedings{yang-etal-2015-wikiqa,
title = "{W}iki{QA}: A Challenge Dataset for Open-Domain Question Answering",
author = "Yang, Yi and
Yih, Wen-tau and
Meek, Christopher",
booktitle = "Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing",
month = sep,
year = "2015",
address = "Lisbon, Portugal",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D15-1237",
doi = "10.18653/v1/D15-1237",
pages = "2013--2018",
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham), [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
morzecrew/RefinedPersonaChat | ---
license: mit
task_categories:
- text-generation
- conversational
- text2text-generation
language:
- ru
size_categories:
- 100K<n<1M
pretty_name: ref-chat
---
This dataset is based on [SiberianPersonaChat Dataset](https://huggingface.co/datasets/SiberiaSoft/SiberianPersonaChat).
It was additionally filtered using the following models (a sketch of the toxicity step follows the list):
- politics filter ([cointegrated/rubert-base-cased-nli-threeway](https://huggingface.co/cointegrated/rubert-base-cased-nli-threeway))
- toxicity filter ([cointegrated/rubert-tiny-toxicity](https://huggingface.co/cointegrated/rubert-tiny-toxicity))
- low-quality QA pair filter ([Andrilko/ruBert-base-reward](https://huggingface.co/Andrilko/ruBert-base-reward))
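For illustration, here is a rough sketch of how the toxicity step might look; the pipeline call is standard `transformers`, but the label name and the 0.5 threshold below are assumptions, not the authors' exact configuration.

```python
from transformers import pipeline

# return scores for all labels of the toxicity classifier
toxicity = pipeline(
    "text-classification",
    model="cointegrated/rubert-tiny-toxicity",
    top_k=None,
)

def is_clean(text: str, threshold: float = 0.5) -> bool:
    scores = toxicity([text])[0]
    # keep the sample if every toxicity-related label stays below the threshold
    return all(
        d["score"] < threshold
        for d in scores
        if d["label"] != "non-toxic"  # label name is an assumption
    )

samples = ["Привет, как дела?"]
filtered = [s for s in samples if is_clean(s)]
```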
**Dataset Statistics:**
- wiki_qa: 4.746
- dialog_personal_context: 68.296
- russianinstructions2: 4.812
- yandexQ_instruct: 6.316
- rugpt4: 5.269
- trupalpaca: 4.284
- text_qa: 2.57
- long_answers_qa: 3.363
- chitchat: 0.198
- reaction: 0.108
- baby: 0.037
---
### Citation
```
@MISC{morzecrew_RefinedPersonaChat,
    author = {Yuri Zaretskiy and Nikolas Ivanov and Igor Kuzmin},
    title = {Refined dataset for conversational agents},
    url = {https://huggingface.co/datasets/morzecrew/RefinedPersonaChat},
    year = {2023}
}
``` |
huggingartists/grimes | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/grimes"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.199833 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/8dd2a89218346f6bdb326bf84cd9eb49.1000x1000x1.png')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/grimes">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Grimes</div>
<a href="https://genius.com/artists/grimes">
<div style="text-align: center; font-size: 14px;">@grimes</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/grimes).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/grimes")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|210| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/grimes")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
# boundaries for a 90/7/3 partition of the lyrics
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year   = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2 | ---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T01:16:14.347906](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2/blob/main/results_2023-09-23T01-16-14.347906.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0964765100671141,\n\
\ \"em_stderr\": 0.0030235709755854464,\n \"f1\": 0.15010381711409398,\n\
\ \"f1_stderr\": 0.0032252432502273593,\n \"acc\": 0.32434164506008656,\n\
\ \"acc_stderr\": 0.007374349538733694\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0964765100671141,\n \"em_stderr\": 0.0030235709755854464,\n\
\ \"f1\": 0.15010381711409398,\n \"f1_stderr\": 0.0032252432502273593\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674337\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6464088397790055,\n \"acc_stderr\": 0.013436541262599954\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T01_16_14.347906
path:
- '**/details_harness|drop|3_2023-09-23T01-16-14.347906.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T01-16-14.347906.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T01_16_14.347906
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-16-14.347906.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-16-14.347906.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:24:55.002122.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:24:55.002122.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T01_16_14.347906
path:
- '**/details_harness|winogrande|5_2023-09-23T01-16-14.347906.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T01-16-14.347906.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_24_55.002122
path:
- results_2023-07-19T17:24:55.002122.parquet
- split: 2023_09_23T01_16_14.347906
path:
- results_2023-09-23T01-16-14.347906.parquet
- split: latest
path:
- results_2023-09-23T01-16-14.347906.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2",
"harness_winogrande_5",
split="train")
```
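For instance, a minimal sketch (assuming the `results` configuration and `latest` split listed in this card's configs) to load the aggregated results of the most recent run:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics;
# the "latest" split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2",
    "results",
    split="latest",
)
```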
## Latest results
These are the [latest results from run 2023-09-23T01:16:14.347906](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2/blob/main/results_2023-09-23T01-16-14.347906.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0964765100671141,
"em_stderr": 0.0030235709755854464,
"f1": 0.15010381711409398,
"f1_stderr": 0.0032252432502273593,
"acc": 0.32434164506008656,
"acc_stderr": 0.007374349538733694
},
"harness|drop|3": {
"em": 0.0964765100671141,
"em_stderr": 0.0030235709755854464,
"f1": 0.15010381711409398,
"f1_stderr": 0.0032252432502273593
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674337
},
"harness|winogrande|5": {
"acc": 0.6464088397790055,
"acc_stderr": 0.013436541262599954
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
thobauma/harmless-poisoned-0.01-dollar-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
benayas/tweet_eval | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 5425122
num_examples: 45615
- name: test
num_bytes: 1279540
num_examples: 12284
- name: validation
num_bytes: 239084
num_examples: 2000
download_size: 4849672
dataset_size: 6943746
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
liuyanchen1015/MULTI_VALUE_rte_non_coordinated_subj_obj | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 99208
num_examples: 188
- name: train
num_bytes: 83320
num_examples: 163
download_size: 131737
dataset_size: 182528
---
# Dataset Card for "MULTI_VALUE_rte_non_coordinated_subj_obj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hack90/ncbi_genbank_part_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 20345583566
num_examples: 137283
download_size: 9397135953
dataset_size: 20345583566
---
# Dataset Card for "ncbi_genbank_part_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hugomathien/equity | ---
license: unknown
---
|
liuyanchen1015/MULTI_VALUE_rte_existential_there | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 46174
num_examples: 113
- name: train
num_bytes: 54957
num_examples: 117
download_size: 73854
dataset_size: 101131
---
# Dataset Card for "MULTI_VALUE_rte_existential_there"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/elbing_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of elbing/エルビング/埃尔宾 (Azur Lane)
This is the dataset of elbing/エルビング/埃尔宾 (Azur Lane), containing 107 images and their tags.
The core tags of this character are `long_hair, breasts, heterochromia, red_eyes, blue_eyes, large_breasts, very_long_hair, white_hair, hat, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 107 | 214.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbing_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 107      | 101.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbing_azurlane/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 272      | 228.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbing_azurlane/resolve/main/dataset-stage3-p480-800.zip)    | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 107      | 179.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbing_azurlane/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 272 | 355.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elbing_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elbing_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
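If you only need one of the pre-packaged variants from the table above, a minimal sketch (assuming the archive filenames shown there, e.g. `dataset-800.zip` for the 800px IMG+TXT package) is:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download one of the pre-packaged archives (see "List of Packages")
zip_file = hf_hub_download(
    repo_id='CyberHarem/elbing_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract the image/tag-text pairs to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```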
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, official_alternate_costume, solo, red_hairband, torn_pantyhose, white_dress, cleavage, frilled_hairband, gloves, arms_up, black_pantyhose, bound, brown_pantyhose, high_heels, red_footwear |
| 1 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_gloves, necktie, braid, simple_background, white_background, holding_umbrella, black_pantyhose, skirt, thigh_strap, blush, grey_hair, shirt |
| 2 | 21 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, cleavage, nightgown, thigh_strap, two_side_up, bangs, hair_ornament, white_dress, necklace, white_panties, lying, official_alternate_costume, thighs, barefoot, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | official_alternate_costume | solo | red_hairband | torn_pantyhose | white_dress | cleavage | frilled_hairband | gloves | arms_up | black_pantyhose | bound | brown_pantyhose | high_heels | red_footwear | black_gloves | necktie | braid | simple_background | white_background | holding_umbrella | skirt | thigh_strap | blush | grey_hair | shirt | nightgown | two_side_up | bangs | hair_ornament | necklace | white_panties | lying | thighs | barefoot |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-----------------------------|:-------|:---------------|:-----------------|:--------------|:-----------|:-------------------|:---------|:----------|:------------------|:--------|:------------------|:-------------|:---------------|:---------------|:----------|:--------|:--------------------|:-------------------|:-------------------|:--------|:--------------|:--------|:------------|:--------|:------------|:--------------|:--------|:----------------|:-----------|:----------------|:--------|:---------|:-----------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | | X | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | X | X | X | | | X | X | | | | | | | | | | | | X | X | | | X | X | | | X | X | X | X | X | X | X | X | X |
|
Travad98/sogc-trademarks-1883-2001 | ---
task_categories:
- image-to-text
tags:
- economics
- legal
pretty_name: t
size_categories:
- 1K<n<10K
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 815537652.616
num_examples: 3003
download_size: 814717080
dataset_size: 815537652.616
---
|
sunhaozhepy/sst_sbert_keywords_embeddings | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: float32
- name: tokens
dtype: string
- name: tree
dtype: string
- name: keywords
dtype: string
- name: keywords_embeddings
sequence: float32
splits:
- name: train
num_bytes: 29334804
num_examples: 8544
- name: validation
num_bytes: 3783247
num_examples: 1101
- name: test
num_bytes: 7588926
num_examples: 2210
download_size: 47016395
dataset_size: 40706977
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
jlbaker361/hacs-segment-pairs | ---
dataset_info:
features:
- name: src_image
dtype: image
- name: src_pose
dtype: image
- name: target_image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 991404.0
num_examples: 4
download_size: 1000950
dataset_size: 991404.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt | ---
pretty_name: Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yihan6324/llama2-7b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T21:00:12.284244](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt/blob/main/results_2023-08-09T21%3A00%3A12.284244.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.506231001015833,\n\
\ \"acc_stderr\": 0.03505018845563652,\n \"acc_norm\": 0.5099522031118208,\n\
\ \"acc_norm_stderr\": 0.035035258453899244,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5317717765572597,\n\
\ \"mc2_stderr\": 0.015775374488304787\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536595,\n\
\ \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6041625174268074,\n\
\ \"acc_stderr\": 0.004880303863138504,\n \"acc_norm\": 0.7895837482573193,\n\
\ \"acc_norm_stderr\": 0.0040677125640782895\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.02391998416404773,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02391998416404773\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n\
\ \"acc_stderr\": 0.028384747788813332,\n \"acc_norm\": 0.532258064516129,\n\
\ \"acc_norm_stderr\": 0.028384747788813332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.034961309720561294,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.034961309720561294\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700286,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700286\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"\
acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.032365852526021574,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.032365852526021574\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.044492703500683836,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.044492703500683836\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914397,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914397\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6794380587484036,\n\
\ \"acc_stderr\": 0.01668889331080376,\n \"acc_norm\": 0.6794380587484036,\n\
\ \"acc_norm_stderr\": 0.01668889331080376\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.02670054542494367,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.02670054542494367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260659,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260659\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159615,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159615\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n\
\ \"acc_stderr\": 0.012409564470235565,\n \"acc_norm\": 0.3820078226857888,\n\
\ \"acc_norm_stderr\": 0.012409564470235565\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4918300653594771,\n \"acc_stderr\": 0.02022513434305726,\n \
\ \"acc_norm\": 0.4918300653594771,\n \"acc_norm_stderr\": 0.02022513434305726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512698,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512698\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5317717765572597,\n\
\ \"mc2_stderr\": 0.015775374488304787\n }\n}\n```"
repo_url: https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:00:12.284244.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:00:12.284244.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T21:00:12.284244.parquet'
- config_name: results
data_files:
- split: 2023_08_09T21_00_12.284244
path:
- results_2023-08-09T21:00:12.284244.parquet
- split: latest
path:
- results_2023-08-09T21:00:12.284244.parquet
---
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-40k-sharegpt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-7b-instructmining-40k-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt",
"harness_truthfulqa_mc_0",
split="train")
```
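The aggregated metrics mentioned above are likewise available through the `results` configuration (a minimal sketch; the configuration and split names follow the `configs` list in the YAML header):
```python
from datasets import load_dataset

# Load the aggregated results for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt",
    "results",
    split="latest",
)
```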
## Latest results
These are the [latest results from run 2023-08-09T21:00:12.284244](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-40k-sharegpt/blob/main/results_2023-08-09T21%3A00%3A12.284244.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.506231001015833,
"acc_stderr": 0.03505018845563652,
"acc_norm": 0.5099522031118208,
"acc_norm_stderr": 0.035035258453899244,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5317717765572597,
"mc2_stderr": 0.015775374488304787
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536595,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097664
},
"harness|hellaswag|10": {
"acc": 0.6041625174268074,
"acc_stderr": 0.004880303863138504,
"acc_norm": 0.7895837482573193,
"acc_norm_stderr": 0.0040677125640782895
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02391998416404773,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02391998416404773
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.034961309720561294,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.034961309720561294
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700286,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700286
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6990825688073394,
"acc_stderr": 0.019664751366802114,
"acc_norm": 0.6990825688073394,
"acc_norm_stderr": 0.019664751366802114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.032834720561085606,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.032834720561085606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.044492703500683836,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.044492703500683836
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914397,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6794380587484036,
"acc_stderr": 0.01668889331080376,
"acc_norm": 0.6794380587484036,
"acc_norm_stderr": 0.01668889331080376
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.02670054542494367,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.02670054542494367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260659,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260659
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159615,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159615
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3820078226857888,
"acc_stderr": 0.012409564470235565,
"acc_norm": 0.3820078226857888,
"acc_norm_stderr": 0.012409564470235565
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4918300653594771,
"acc_stderr": 0.02022513434305726,
"acc_norm": 0.4918300653594771,
"acc_norm_stderr": 0.02022513434305726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512698,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512698
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5317717765572597,
"mc2_stderr": 0.015775374488304787
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DSSGxMunich/document_text | ---
license: mit
---
# Dataset Card for document_texts
## Dataset Description
* **Homepage:** [DSSGx Munich](https://sites.google.com/view/dssgx-munich-2023/startseite) organization page.
* **Repository:** [GitHub](https://github.com/DSSGxMunich/land-sealing-dataset-and-analysis).
### Dataset Summary
This dataset contains the result of the PDF parsing done by Tika. For each document, it contains the land parcel it refers to and the downloaded content.
## Dataset Structure
### Data Fields
- **filename:** Name of the parsed pdf file.
- **document_id:** Unique ID of the document, formed by combining the land parcel ID with the number of the document within that land parcel.
- **content:** Extracted text content.
- **land_parcel_id:** Unique ID of the land parcel for the document.
- **land_parcel_name:** Name of the land parcel for the document.
- **land_parcel_scanurl:** URL for the parsed content.
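A minimal usage sketch for these fields (the split name `train` is an assumption; adjust it to the actual repository layout):
```python
from datasets import load_dataset

# Assumed split name; check the repository for the actual configuration.
ds = load_dataset("DSSGxMunich/document_text", split="train")

# Inspect the fields described above on the first record.
record = ds[0]
print(record["filename"], record["document_id"], record["land_parcel_id"])
print(record["content"][:200])  # first 200 characters of the extracted text
```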
### Source Data
The data comes from the module `document_texts_creation`. |
thanhduycao/soict_private_test_v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 567746816
num_examples: 2139
download_size: 461190048
dataset_size: 567746816
---
# Dataset Card for "soict_private_test_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sadFaceEmoji/english-poems | ---
task_categories:
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
This dataset contains 93,265 English poems. |
open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-chat-hf-gpt-4-80k | ---
pretty_name: Evaluation run of JCX-kcuf/Llama-2-7b-chat-hf-gpt-4-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JCX-kcuf/Llama-2-7b-chat-hf-gpt-4-80k](https://huggingface.co/JCX-kcuf/Llama-2-7b-chat-hf-gpt-4-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-chat-hf-gpt-4-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T16:38:02.853843](https://huggingface.co/datasets/open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-chat-hf-gpt-4-80k/blob/main/results_2024-03-24T16-38-02.853843.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48794065821108334,\n\
\ \"acc_stderr\": 0.034336121936290494,\n \"acc_norm\": 0.4930534896524456,\n\
\ \"acc_norm_stderr\": 0.035095955994255836,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815032,\n \"mc2\": 0.48450117635749573,\n\
\ \"mc2_stderr\": 0.015286188446075932\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4991467576791809,\n \"acc_stderr\": 0.014611369529813272,\n\
\ \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633825\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5571599283011353,\n\
\ \"acc_stderr\": 0.004957068377516512,\n \"acc_norm\": 0.746265684126668,\n\
\ \"acc_norm_stderr\": 0.0043425802776627265\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.04489539350270701,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.04489539350270701\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5645161290322581,\n \"acc_stderr\": 0.02820622559150274,\n \"\
acc_norm\": 0.5645161290322581,\n \"acc_norm_stderr\": 0.02820622559150274\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n \"\
acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6414141414141414,\n \"acc_stderr\": 0.034169036403915214,\n \"\
acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.034169036403915214\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6862385321100918,\n \"acc_stderr\": 0.019894723341469113,\n \"\
acc_norm\": 0.6862385321100918,\n \"acc_norm_stderr\": 0.019894723341469113\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6835443037974683,\n \"acc_stderr\": 0.03027497488021898,\n \
\ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.03027497488021898\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097173,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097173\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.029202540153431183,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.029202540153431183\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\
\ \"acc_stderr\": 0.0167063814150579,\n \"acc_norm\": 0.6781609195402298,\n\
\ \"acc_norm_stderr\": 0.0167063814150579\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n\
\ \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2223463687150838,\n\
\ \"acc_stderr\": 0.013907189208156881,\n \"acc_norm\": 0.2223463687150838,\n\
\ \"acc_norm_stderr\": 0.013907189208156881\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3513689700130378,\n\
\ \"acc_stderr\": 0.01219296945748402,\n \"acc_norm\": 0.3513689700130378,\n\
\ \"acc_norm_stderr\": 0.01219296945748402\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004144,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004144\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47875816993464054,\n \"acc_stderr\": 0.02020957238860025,\n \
\ \"acc_norm\": 0.47875816993464054,\n \"acc_norm_stderr\": 0.02020957238860025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433432,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815032,\n \"mc2\": 0.48450117635749573,\n\
\ \"mc2_stderr\": 0.015286188446075932\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893129\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18347232752084913,\n \
\ \"acc_stderr\": 0.010661370448699654\n }\n}\n```"
repo_url: https://huggingface.co/JCX-kcuf/Llama-2-7b-chat-hf-gpt-4-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|arc:challenge|25_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|gsm8k|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hellaswag|10_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-38-02.853843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T16-38-02.853843.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- '**/details_harness|winogrande|5_2024-03-24T16-38-02.853843.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T16-38-02.853843.parquet'
- config_name: results
data_files:
- split: 2024_03_24T16_38_02.853843
path:
- results_2024-03-24T16-38-02.853843.parquet
- split: latest
path:
- results_2024-03-24T16-38-02.853843.parquet
---
# Dataset Card for Evaluation run of JCX-kcuf/Llama-2-7b-chat-hf-gpt-4-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JCX-kcuf/Llama-2-7b-chat-hf-gpt-4-80k](https://huggingface.co/JCX-kcuf/Llama-2-7b-chat-hf-gpt-4-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-chat-hf-gpt-4-80k",
"harness_winogrande_5",
split="train")
```
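The aggregated metrics can be loaded the same way from the "results" configuration; the "latest" split (defined in the YAML above) points at the most recent results file:
```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-chat-hf-gpt-4-80k",
	"results",
	split="latest")
```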
## Latest results
These are the [latest results from run 2024-03-24T16:38:02.853843](https://huggingface.co/datasets/open-llm-leaderboard/details_JCX-kcuf__Llama-2-7b-chat-hf-gpt-4-80k/blob/main/results_2024-03-24T16-38-02.853843.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48794065821108334,
"acc_stderr": 0.034336121936290494,
"acc_norm": 0.4930534896524456,
"acc_norm_stderr": 0.035095955994255836,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815032,
"mc2": 0.48450117635749573,
"mc2_stderr": 0.015286188446075932
},
"harness|arc:challenge|25": {
"acc": 0.4991467576791809,
"acc_stderr": 0.014611369529813272,
"acc_norm": 0.5477815699658704,
"acc_norm_stderr": 0.014544519880633825
},
"harness|hellaswag|10": {
"acc": 0.5571599283011353,
"acc_stderr": 0.004957068377516512,
"acc_norm": 0.746265684126668,
"acc_norm_stderr": 0.0043425802776627265
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.04489539350270701,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.04489539350270701
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.02820622559150274,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.02820622559150274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.034169036403915214,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.034169036403915214
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6862385321100918,
"acc_stderr": 0.019894723341469113,
"acc_norm": 0.6862385321100918,
"acc_norm_stderr": 0.019894723341469113
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.032834720561085606,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.032834720561085606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097173,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097173
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.029202540153431183,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.029202540153431183
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.0167063814150579,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.0167063814150579
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2223463687150838,
"acc_stderr": 0.013907189208156881,
"acc_norm": 0.2223463687150838,
"acc_norm_stderr": 0.013907189208156881
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3513689700130378,
"acc_stderr": 0.01219296945748402,
"acc_norm": 0.3513689700130378,
"acc_norm_stderr": 0.01219296945748402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004144,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004144
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47875816993464054,
"acc_stderr": 0.02020957238860025,
"acc_norm": 0.47875816993464054,
"acc_norm_stderr": 0.02020957238860025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433432,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815032,
"mc2": 0.48450117635749573,
"mc2_stderr": 0.015286188446075932
},
"harness|winogrande|5": {
"acc": 0.728492501973165,
"acc_stderr": 0.012499326254893129
},
"harness|gsm8k|5": {
"acc": 0.18347232752084913,
"acc_stderr": 0.010661370448699654
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DataStudio/OCR-DigitDataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: ARDIS
num_bytes: 86816946.5
num_examples: 7364
- name: MINST
num_bytes: 10222909.0
num_examples: 30000
download_size: 96510383
dataset_size: 97039855.5
configs:
- config_name: default
data_files:
- split: ARDIS
path: data/ARDIS-*
- split: MINST
path: data/MINST-*
---
|
DZN222/rafael | ---
license: openrail
---
|
distilled-from-one-sec-cv12/chunk_208 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1203607960
num_examples: 234530
download_size: 1230832909
dataset_size: 1203607960
---
# Dataset Card for "chunk_208"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naamche/geosql-llm-eval | ---
license: mit
language:
- en
size_categories:
- n<1K
---
This dataset was made for evaluating Text-to-SQL systems on geography-based applications.
Currently, we have released only 109 (natural_language, sql_query) pairs.
Steps:
1. First, unzip all the .shp files and load them into your PostgreSQL database instance.
2. Load the (text, sql) pairs from the .csv file into your program of choice.
3. Generate SQL for the questions using your own LLM and compare the results any way you like (a minimal sketch follows this list).
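A minimal sketch of step 3, assuming the shapefiles were already loaded per step 1 (e.g., with `shp2pgsql`). The CSV filename, its column names (`natural_language`, `sql_query`), the connection string, and the `generate_sql` callable are illustrative assumptions, not part of the dataset:
```python
import pandas as pd
import psycopg2

# Assumed file/column names -- adjust to the released CSV and your setup.
pairs = pd.read_csv("geosql_eval.csv")  # assumed columns: natural_language, sql_query
conn = psycopg2.connect("dbname=geosql user=postgres")  # PostGIS-enabled instance

def run_query(sql: str) -> set:
    """Execute a query and return its rows as a set, for order-insensitive comparison."""
    with conn.cursor() as cur:
        cur.execute(sql)
        return set(cur.fetchall())

def execution_accuracy(generate_sql) -> float:
    """Score a text-to-SQL callable by comparing result sets against the gold SQL."""
    correct = 0
    for _, row in pairs.iterrows():
        try:
            if run_query(generate_sql(row["natural_language"])) == run_query(row["sql_query"]):
                correct += 1
        except psycopg2.Error:
            conn.rollback()  # invalid generated SQL counts as a miss
    return correct / len(pairs)
```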
### Requirements
1. You need PostgreSQL installed with the PostGIS extension enabled.
2. You need the TIGER geocoder enabled for the state of Florida only,
i.e., the geocoding done in this dataset is on addresses from Florida only.
For more information on installing the TIGER geocoder, see the book *PostGIS in Action* by R. Obe and L. Hsu,
Chapter 10: PostGIS TIGER geocoder.
Copyright
reAlpha Tech Corp, 2024
Made by:
ML Team, Naamche |
openclimatefix/uk_pv | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- machine-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: United Kingdom PV Solar generation
size_categories:
- 1B<n<10B
source_datasets:
- original
tags:
- pv
- photovoltaic
- environment
- climate
- energy
- electricity
task_categories:
- time-series-forecasting
task_ids:
- multivariate-time-series-forecasting
---
# UK PV dataset
PV solar generation data from the UK.
This dataset contains data from 1311 PV systems from 2018 to 2021.
Time granularity varies from 2 minutes to 30 minutes.
This data is collected from live PV systems in the UK. We have obfuscated the location of the PV systems for privacy.
If you are the owner of a PV system in the dataset, and do not want this data to be shared,
please do get in contact with info@openclimatefix.org.
## Files
- metadata.csv: Data about the PV systems, e.g. location
- 2min.parquet: Power output for PV systems every 2 minutes.
- 5min.parquet: Power output for PV systems every 5 minutes.
- 30min.parquet: Power output for PV systems every 30 minutes.
- pv.netcdf: (legacy) Time series of PV solar generation every 5 minutes
### metadata.csv
Metadata of the different PV systems.
Note that there are extra PV systems in this metadata that do not appear in the PV time-series data.
The csv columns are:
- ss_id: the id of the system
- latitude_rounded: latitude of the PV system, but rounded to approximately the nearest km
- longitude_rounded: longitude of the PV system, but rounded to approximately the nearest km
- llsoacd: TODO
- orientation: The orientation of the PV system
- tilt: The tilt of the PV system
- kwp: The capacity of the PV system
- operational_at: the datetime the PV system started working
### {2,5,30}min.parquet
Time series of solar generation for a number of systems.
Each file includes the systems for which there is enough granularity.
In particular, the systems in 2min.parquet and 5min.parquet are also in 30min.parquet.
The files contain 3 columns (a short loading sketch follows the list):
- ss_id: the id of the system
- timestamp: the timestamp
- generation_wh: the generated power (in kW) at the given timestamp for the given system
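A minimal sketch of joining the time series with the system metadata, assuming `metadata.csv` and `30min.parquet` have been downloaded locally from this repository (pandas is an assumption; any CSV/parquet reader works):
```python
import pandas as pd

meta = pd.read_csv("metadata.csv")
gen = pd.read_parquet("30min.parquet")  # columns: ss_id, timestamp, generation_wh

# Attach per-system metadata (location, tilt, capacity) to each reading.
df = gen.merge(meta, on="ss_id", how="left")

# Example: average generation per system.
print(df.groupby("ss_id")["generation_wh"].mean().head())
```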
### pv.netcdf (legacy)
Time series of PV solar generation data in an [xarray](https://docs.xarray.dev/en/stable/) format.
The data variables are the same as 'ss_id' in the metadata.
Each data variable contains the solar generation (in kW) for that PV system.
The ss_id's here are a subset of all the ss_id's in the metadata.
The coordinates of the data are tagged as 'datetime', which is the datetime of the solar generation reading.
This is a subset of the more recent `5min.parquet` file.
## Example
Using Hugging Face Datasets:
```python
from datasets import load_dataset
dataset = load_dataset("openclimatefix/uk_pv")
```
## Useful links
https://huggingface.co/docs/datasets/share - this repo was made by following this tutorial |
DynamicSuperbPrivate/HowFarAreYou_3DSpeaker | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 876846831.63
num_examples: 9253
download_size: 840306291
dataset_size: 876846831.63
---
# Dataset Card for "HowFarAreYou_3DSpeaker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BaorBaor/60k_data_multichoice | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: context
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: E
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 330281409
num_examples: 60347
- name: valid
num_bytes: 1112116
num_examples: 200
download_size: 183246252
dataset_size: 331393525
---
# Dataset Card for "60k_data_multichoice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ruliad/stack-v2-python-with-content-v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: repo_name
dtype: string
splits:
- name: train
num_bytes: 39570777787
num_examples: 10518988
download_size: 13545022349
dataset_size: 39570777787
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Denisilva/VOZSuellen_k | ---
license: openrail
---
|
hippocrates/PubMedQA | ---
license: apache-2.0
---
|
ondevicellm/tulu-v2-sft-mixture | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1239293363
num_examples: 326154
download_size: 554602355
dataset_size: 1239293363
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
varox34/demo | ---
annotations_creators:
- expert-generated
language:
- es
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: UD_Spanish-AnCora
size_categories: []
source_datasets: []
tags: []
task_categories:
- token-classification
task_ids:
- part-of-speech
---
# UD_Spanish-AnCora
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Website:** https://github.com/UniversalDependencies/UD_Spanish-AnCora
- **Point of Contact:** [Daniel Zeman](zeman@ufal.mff.cuni.cz)
### Dataset Summary
This dataset is composed of the annotations from the [AnCora corpus](http://clic.ub.edu/corpus/), projected on the [Universal Dependencies treebank](https://universaldependencies.org/). We use the POS annotations of this corpus as part of the EvalEs Spanish language benchmark.
### Supported Tasks and Leaderboards
POS tagging
### Languages
The dataset is in Spanish (`es-ES`).
## Dataset Structure
### Data Instances
Three CoNLL-U files.
Annotations are encoded in plain text files (UTF-8, normalized to NFC, using only the LF character as line break, including an LF character at the end of file) with three types of lines:
1) Word lines containing the annotation of a word/token in 10 fields separated by single tab characters (see below).
2) Blank lines marking sentence boundaries.
3) Comment lines starting with hash (#).
### Data Fields
Word lines contain the following fields (a small parsing sketch follows this list):
1) ID: Word index, integer starting at 1 for each new sentence; may be a range for multiword tokens; may be a decimal number for empty nodes (decimal numbers can be lower than 1 but must be greater than 0).
2) FORM: Word form or punctuation symbol.
3) LEMMA: Lemma or stem of word form.
4) UPOS: Universal part-of-speech tag.
5) XPOS: Language-specific part-of-speech tag; underscore if not available.
6) FEATS: List of morphological features from the universal feature inventory or from a defined language-specific extension; underscore if not available.
7) HEAD: Head of the current word, which is either a value of ID or zero (0).
8) DEPREL: Universal dependency relation to the HEAD (root iff HEAD = 0) or a defined language-specific subtype of one.
9) DEPS: Enhanced dependency graph in the form of a list of head-deprel pairs.
10) MISC: Any other annotation.
From: [https://universaldependencies.org](https://universaldependencies.org/guidelines.html)
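As an illustration only (not an official parser), here is a minimal sketch that reads word lines into the ten fields above, assuming a local copy of one of the split files:
```python
FIELDS = ["ID", "FORM", "LEMMA", "UPOS", "XPOS",
          "FEATS", "HEAD", "DEPREL", "DEPS", "MISC"]

def read_conllu(path):
    """Yield one sentence at a time as a list of {field: value} dicts."""
    sentence = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if line.startswith("#"):   # comment line
                continue
            if not line:               # blank line marks a sentence boundary
                if sentence:
                    yield sentence
                    sentence = []
                continue
            sentence.append(dict(zip(FIELDS, line.split("\t"))))
    if sentence:
        yield sentence

for sent in read_conllu("es_ancora-ud-dev.conllu"):
    print([(tok["FORM"], tok["UPOS"]) for tok in sent])
    break
```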
### Data Splits
- es_ancora-ud-train.conllu
- es_ancora-ud-dev.conllu
- es_ancora-ud-test.conllu
## Dataset Creation
### Curation Rationale
[N/A]
### Source Data
[UD_Spanish-AnCora](https://github.com/UniversalDependencies/UD_Spanish-AnCora)
#### Initial Data Collection and Normalization
The original annotation was done in a constituency framework as a part of the [AnCora project](http://clic.ub.edu/corpus/) at the University of Barcelona. It was converted to dependencies by the [Universal Dependencies team](https://universaldependencies.org/) and used in the CoNLL 2009 shared task. The CoNLL 2009 version was later converted to HamleDT and to Universal Dependencies.
For more information on the AnCora project, visit the [AnCora site](http://clic.ub.edu/corpus/).
To learn about Universal Dependencies, visit [https://universaldependencies.org](https://universaldependencies.org).
#### Who are the source language producers?
For more information on the AnCora corpus and its sources, visit the [AnCora site](http://clic.ub.edu/corpus/).
### Annotations
#### Annotation process
For more information on the first AnCora annotation, visit the [AnCora site](http://clic.ub.edu/corpus/).
#### Who are the annotators?
For more information on the AnCora annotation team, visit the [AnCora site](http://clic.ub.edu/corpus/).
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset contributes to the development of language models in Spanish.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
[N/A]
### Licensing Information
This work is licensed under a [CC Attribution 4.0 International License](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
The following paper must be cited when using this corpus:
Taulé, M., M.A. Martí, M. Recasens (2008) 'Ancora: Multilevel Annotated Corpora for Catalan and Spanish', Proceedings of 6th International Conference on Language Resources and Evaluation. Marrakesh (Morocco).
To cite the Universal Dependencies project:
Rueter, J. (Creator), Erina, O. (Contributor), Klementeva, J. (Contributor), Ryabov, I. (Contributor), Tyers, F. M. (Contributor), Zeman, D. (Contributor), Nivre, J. (Creator) (15 Nov 2020). Universal Dependencies version 2.7 Erzya JR. Universal Dependencies Consortium.
### Contributions
[N/A]
|
open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B | ---
pretty_name: Evaluation run of EleutherAI/gpt-neo-2.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/gpt-neo-2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T18:17:27.118418](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B/blob/main/results_2023-09-16T18-17-27.118418.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196460643,\n \"f1\": 0.04774853187919481,\n\
\ \"f1_stderr\": 0.0012502430800989544,\n \"acc\": 0.3067599823596958,\n\
\ \"acc_stderr\": 0.008435917406608623\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460643,\n\
\ \"f1\": 0.04774853187919481,\n \"f1_stderr\": 0.0012502430800989544\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \
\ \"acc_stderr\": 0.003106901266499639\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6006314127861089,\n \"acc_stderr\": 0.013764933546717609\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/gpt-neo-2.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T18_17_27.118418
path:
- '**/details_harness|drop|3_2023-09-16T18-17-27.118418.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T18-17-27.118418.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T18_17_27.118418
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-17-27.118418.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-17-27.118418.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:18:37.000373.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:18:37.000373.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T18_17_27.118418
path:
- '**/details_harness|winogrande|5_2023-09-16T18-17-27.118418.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T18-17-27.118418.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_18_37.000373
path:
- results_2023-07-19T17:18:37.000373.parquet
- split: 2023_09_16T18_17_27.118418
path:
- results_2023-09-16T18-17-27.118418.parquet
- split: latest
path:
- results_2023-09-16T18-17-27.118418.parquet
---
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-neo-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T18:17:27.118418](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B/blob/main/results_2023-09-16T18-17-27.118418.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460643,
"f1": 0.04774853187919481,
"f1_stderr": 0.0012502430800989544,
"acc": 0.3067599823596958,
"acc_stderr": 0.008435917406608623
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460643,
"f1": 0.04774853187919481,
"f1_stderr": 0.0012502430800989544
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499639
},
"harness|winogrande|5": {
"acc": 0.6006314127861089,
"acc_stderr": 0.013764933546717609
}
}
```
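The aggregated "results" configuration mentioned in the summary can be loaded the same way; the split names follow the run timestamps listed in the configs, with "latest" pointing to the most recent run:
```python
from datasets import load_dataset

# load the aggregated metrics for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B",
                       "results",
                       split="latest")
```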
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T | ---
pretty_name: Evaluation run of OEvortex/HelpingAI-Lite-1.5T
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OEvortex/HelpingAI-Lite-1.5T](https://huggingface.co/OEvortex/HelpingAI-Lite-1.5T)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T06:17:09.699346](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T/blob/main/results_2024-03-10T06-17-09.699346.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2635465581409758,\n\
\ \"acc_stderr\": 0.031199778547091002,\n \"acc_norm\": 0.26467294429469646,\n\
\ \"acc_norm_stderr\": 0.03197040307669128,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3861173734844904,\n\
\ \"mc2_stderr\": 0.014144546234841945\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28924914675767915,\n \"acc_stderr\": 0.013250012579393443,\n\
\ \"acc_norm\": 0.3122866894197952,\n \"acc_norm_stderr\": 0.013542598541688065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.40838478390758814,\n\
\ \"acc_stderr\": 0.00490530437109087,\n \"acc_norm\": 0.5238996215893248,\n\
\ \"acc_norm_stderr\": 0.004984077906216095\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.032790004063100515,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.032790004063100515\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241238,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292326,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292326\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.0303137105381989,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.0303137105381989\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775296,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.02110773012724398,\n \
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.02110773012724398\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722734,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722734\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2107843137254902,\n \"acc_stderr\": 0.028626547912437416,\n \"\
acc_norm\": 0.2107843137254902,\n \"acc_norm_stderr\": 0.028626547912437416\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.01588988836256049,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.01588988836256049\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0230836585869842,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0230836585869842\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808868,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808868\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.2379421221864952,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.010926496102034947,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.010926496102034947\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401466,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3861173734844904,\n\
\ \"mc2_stderr\": 0.014144546234841945\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5832675611681136,\n \"acc_stderr\": 0.013856250072796316\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723889967\n }\n}\n```"
repo_url: https://huggingface.co/OEvortex/HelpingAI-Lite-1.5T
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|winogrande|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T06-17-09.699346.parquet'
- config_name: results
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- results_2024-03-10T06-17-09.699346.parquet
- split: latest
path:
- results_2024-03-10T06-17-09.699346.parquet
---
# Dataset Card for Evaluation run of OEvortex/HelpingAI-Lite-1.5T
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OEvortex/HelpingAI-Lite-1.5T](https://huggingface.co/OEvortex/HelpingAI-Lite-1.5T) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T06:17:09.699346](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T/blob/main/results_2024-03-10T06-17-09.699346.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2635465581409758,
"acc_stderr": 0.031199778547091002,
"acc_norm": 0.26467294429469646,
"acc_norm_stderr": 0.03197040307669128,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3861173734844904,
"mc2_stderr": 0.014144546234841945
},
"harness|arc:challenge|25": {
"acc": 0.28924914675767915,
"acc_stderr": 0.013250012579393443,
"acc_norm": 0.3122866894197952,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.40838478390758814,
"acc_stderr": 0.00490530437109087,
"acc_norm": 0.5238996215893248,
"acc_norm_stderr": 0.004984077906216095
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.032790004063100515,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.032790004063100515
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241238,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.03396116205845335,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.03396116205845335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292326,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292326
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.02110773012724398,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.02110773012724398
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722734,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2107843137254902,
"acc_stderr": 0.028626547912437416,
"acc_norm": 0.2107843137254902,
"acc_norm_stderr": 0.028626547912437416
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.01588988836256049,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.01588988836256049
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0230836585869842,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0230836585869842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808868,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808868
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034947,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034947
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401466,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3861173734844904,
"mc2_stderr": 0.014144546234841945
},
"harness|winogrande|5": {
"acc": 0.5832675611681136,
"acc_stderr": 0.013856250072796316
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723889967
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
linhtran92/viet_vlsp | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 24081636306.031
num_examples: 171441
- name: validation
num_bytes: 1046661092.259
num_examples: 7501
download_size: 25080683463
dataset_size: 25128297398.289997
---
# Dataset Card for "viet_vlsp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
somosnlp/somos_alpaca_validation_agreement | ---
dataset_info:
features:
- name: id
dtype: string
- name: annotation
dtype: string
- name: count
dtype: int64
splits:
- name: train
num_bytes: 777430
num_examples: 12615
download_size: 477855
dataset_size: 777430
---
# Dataset Card for "somos_alpaca_validation_agreement"
El conjunto de datos de acuerdo, resultado de un esfuerzo colaborativo para limpiar el dataset Alpaca, reúne anotaciones en las que existe consenso entre los anotadores. Este conjunto de datos es de gran utilidad para identificar casos en los que se alcanza un acuerdo claro en las etiquetas asignadas, permitiendo así mejorar la calidad y confiabilidad de los datos. A continuación, presentamos una representación gráfica que muestra la distribución y cantidad de cada anotación en el conjunto de datos de acuerdo.

La mejora del dataset está en progreso pero queremos agradecer a todos los participantes que han aportado los siguientes datasets. Una vez se finalice el proceso se incluirán todos los nombres en los agradecimientos:
```python
dataset_urls = [
"beta3/somos-clean-alpaca-es-validations",
"Sebastian77/somos-alpaca-es",
"lopezjm96/somos-clean-alpaca-es-validations",
"Sebastian77/somos-alpaca-es",
"abrazador/somos-alpaca-es-mario",
"maga12/somos-clean-alpaca-es-validations",
"monicaeme/somos-alpaca-es",
"dvilasuero/somos-alpaca-es-intro",
"mserras/alpaca-es-hackaton-validated",
"dariolopez/somos-clean-alpaca-es-validations",
"alarcon7a/somos-clean-alpaca-es-validations",
"nataliaElv/somos-clean-alpaca-es-validations",
"hackathon-somos-nlp-2023/alpaca-es-agentes"
]
``` |
CyberHarem/feynman_yoshino_renaiflops | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Feynman Yoshino
This is the dataset of Feynman Yoshino, containing 74 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 74 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 168 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 207 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 74 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 74 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 74 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 168 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 168 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 142 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 207 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 207 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
meet1812/meet_data_1 | ---
dataset_info:
features:
- name: new_text
dtype: string
splits:
- name: train
num_bytes: 269273
num_examples: 1000
download_size: 112000
dataset_size: 269273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fai/testingdataset | ---
license: mit
---
|
vietgpt-archive/thuvienphapluat_qa_vi | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: time
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 887637549
num_examples: 292167
download_size: 268836431
dataset_size: 887637549
---
# Dataset Card for "thuvienphapluat_qa_vi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-project-adversarial_qa-92a1abad-1303449870 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: nbroad/rob-base-superqa2
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: test
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: nbroad/rob-base-superqa2
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model. |
DaviGamer/KennyMcCormick | ---
license: openrail
---
|
mtkinit/SuperDataset293810 | ---
pretty_name: SuperDataset293810
tags:
- uci
- world
---
# SuperDataset293810
Created from AIOD platform |
MajdTannous/Dataset2 | ---
pretty_name: SQuAD
viewer: true
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: squad
train-eval-index:
- config: plain_text
task: question-answering
task_id: extractive_question_answering
splits:
train_split: train
eval_split: validation
col_mapping:
question: question
context: context
answers:
text: text
answer_start: answer_start
metrics:
- type: squad
name: SQuAD
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
config_name: plain_text
splits:
- name: train
num_bytes: 79317110
num_examples: 87599
- name: validation
num_bytes: 10472653
num_examples: 10570
download_size: 35142551
dataset_size: 89789763
---
# Dataset Card for "squad"
## Table of Contents
- [Dataset Card for "squad"](#dataset-card-for-squad)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [plain_text](#plain_text)
- [Data Fields](#data-fields)
- [plain_text](#plain_text-1)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://rajpurkar.github.io/SQuAD-explorer/](https://rajpurkar.github.io/SQuAD-explorer/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 35.14 MB
- **Size of the generated dataset:** 89.92 MB
- **Total amount of disk used:** 125.06 MB
### Dataset Summary
Stanford Question Answering Dataset (SQuAD) is a reading comprehension dataset, consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text, or span, from the corresponding reading passage, or the question might be unanswerable.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### plain_text
- **Size of downloaded dataset files:** 35.14 MB
- **Size of the generated dataset:** 89.92 MB
- **Total amount of disk used:** 125.06 MB
An example of 'train' looks as follows.
```
{
"answers": {
"answer_start": [1],
"text": ["This is a test text"]
},
"context": "This is a test context.",
"id": "1",
"question": "Is this a test?",
"title": "train test"
}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
  - `answer_start`: an `int32` feature.
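For a concrete look at these fields, here is a minimal sketch (assuming the `datasets` library is installed) that loads the `plain_text` config and verifies that each answer is the span of `context` beginning at `answer_start`:
```python
from datasets import load_dataset

# Load the plain_text config; the train split has 87,599 examples.
squad = load_dataset("squad", "plain_text")

example = squad["train"][0]
print(example["title"], "-", example["question"])

# `answers` is a dict of parallel lists: one text per answer_start offset.
for text, start in zip(example["answers"]["text"],
                       example["answers"]["answer_start"]):
    # Each answer should be the slice of the context at its offset.
    print(example["context"][start:start + len(text)] == text)
```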
### Data Splits
| name |train|validation|
|----------|----:|---------:|
|plain_text|87599| 10570|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{2016arXiv160605250R,
author = {{Rajpurkar}, Pranav and {Zhang}, Jian and {Lopyrev},
Konstantin and {Liang}, Percy},
title = "{SQuAD: 100,000+ Questions for Machine Comprehension of Text}",
journal = {arXiv e-prints},
year = 2016,
eid = {arXiv:1606.05250},
pages = {arXiv:1606.05250},
archivePrefix = {arXiv},
eprint = {1606.05250},
}
``` |
reddyprasade/Q_A_Dataset | ---
license: apache-2.0
---
|
ostapeno/dolly | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 13007120
num_examples: 15011
download_size: 7493126
dataset_size: 13007120
---
# Dataset Card for "dolly"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
senhorsapo/shadow | ---
license: openrail
---
|
carlosug/ResearchInstall | ---
license: mit
---
# Dataset Card for RSInstall Corpus
### Dataset Description
#### Links
+ **Repository:**
+ **Point of Contact:**
#### Dataset Summary
RSInstall is a small-scale text-to-unified-representation dataset consisting of 30 installation instructions, each with manually labeled plan, step, and topic annotations. For more information about the definitions, see the [repo]().
#### Language
English
#### Data Structure
##### Data Instance
....
##### Data Fields
- software
- repo_name
- readme_url
- content
- plan
- steps
- optional_steps
- extra_info_optional
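A minimal loading sketch, assuming the corpus can be read from this repo with the `datasets` library and that the 30 annotated instructions live in a single `train` split (field names follow the list above):
```python
from datasets import load_dataset

# Assumption: the repo exposes its data files as a "train" split.
rsinstall = load_dataset("carlosug/ResearchInstall", split="train")

for row in rsinstall.select(range(3)):
    print(row["software"], "-", row["repo_name"])
    print("plan:", row["plan"])
    print("steps:", row["steps"])
```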
#### Dataset Creation
##### Curation Rationale
...
##### Who are the source language producers?
Humans creating software
##### Who are the annotators?
Researchers in AI/ML
#### Licensing Information
mit
#### Citation
.... |
hesh0629/celebA_LLaVA | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 11471009427.626
num_examples: 202599
download_size: 10486425131
dataset_size: 11471009427.626
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-financial_phrasebank-sentences_allagree-c1bf87-48200145240 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- financial_phrasebank
eval_info:
task: multi_class_classification
model: ahmedrachid/FinancialBERT-Sentiment-Analysis
metrics: ['bleu', 'google_bleu']
dataset_name: financial_phrasebank
dataset_config: sentences_allagree
dataset_split: train
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: ahmedrachid/FinancialBERT-Sentiment-Analysis
* Dataset: financial_phrasebank
* Config: sentences_allagree
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
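For reference, a minimal sketch (not the evaluator's actual code) of how the `col_mapping` in the header lines up the source columns with the fields this task expects:
```python
from datasets import load_dataset

# Same dataset, config, and split as this evaluation job.
ds = load_dataset("financial_phrasebank", "sentences_allagree", split="train")

# Apply the card's col_mapping: sentence -> text, label -> target.
ds = ds.rename_columns({"sentence": "text", "label": "target"})
print(ds[0])
```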
## Contributions
Thanks to [@du](https://huggingface.co/du) for evaluating this model. |
open-llm-leaderboard/details_mahiatlinux__MasherAI-7B-v3 | ---
pretty_name: Evaluation run of mahiatlinux/MasherAI-7B-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mahiatlinux/MasherAI-7B-v3](https://huggingface.co/mahiatlinux/MasherAI-7B-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mahiatlinux__MasherAI-7B-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T15:20:09.015425](https://huggingface.co/datasets/open-llm-leaderboard/details_mahiatlinux__MasherAI-7B-v3/blob/main/results_2024-03-21T15-20-09.015425.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6394628958949224,\n\
\ \"acc_stderr\": 0.03213248204322483,\n \"acc_norm\": 0.6434780347902122,\n\
\ \"acc_norm_stderr\": 0.032780101431233326,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.47627833893064514,\n\
\ \"mc2_stderr\": 0.01515113122576049\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946705,\n\
\ \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.014027516814585186\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6202947619996017,\n\
\ \"acc_stderr\": 0.004843216325090254,\n \"acc_norm\": 0.82194781915953,\n\
\ \"acc_norm_stderr\": 0.003817748269107782\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400352,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400352\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847835,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847835\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368976,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368976\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.0151665445504903,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.0151665445504903\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083143,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083143\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.47627833893064514,\n\
\ \"mc2_stderr\": 0.01515113122576049\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019823\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4715693707354056,\n \
\ \"acc_stderr\": 0.013750202076584424\n }\n}\n```"
repo_url: https://huggingface.co/mahiatlinux/MasherAI-7B-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|arc:challenge|25_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|gsm8k|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hellaswag|10_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-20-09.015425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T15-20-09.015425.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- '**/details_harness|winogrande|5_2024-03-21T15-20-09.015425.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T15-20-09.015425.parquet'
- config_name: results
data_files:
- split: 2024_03_21T15_20_09.015425
path:
- results_2024-03-21T15-20-09.015425.parquet
- split: latest
path:
- results_2024-03-21T15-20-09.015425.parquet
---
# Dataset Card for Evaluation run of mahiatlinux/MasherAI-7B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mahiatlinux/MasherAI-7B-v3](https://huggingface.co/mahiatlinux/MasherAI-7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mahiatlinux__MasherAI-7B-v3",
"harness_winogrande_5",
split="train")
```
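Each of the 63 configurations can be loaded the same way; to enumerate them programmatically, a small sketch using the standard `datasets` utility:
```python
from datasets import get_dataset_config_names

# One config per harness task, plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_mahiatlinux__MasherAI-7B-v3"
)
print(len(configs), configs[:5])
```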
## Latest results
These are the [latest results from run 2024-03-21T15:20:09.015425](https://huggingface.co/datasets/open-llm-leaderboard/details_mahiatlinux__MasherAI-7B-v3/blob/main/results_2024-03-21T15-20-09.015425.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6394628958949224,
"acc_stderr": 0.03213248204322483,
"acc_norm": 0.6434780347902122,
"acc_norm_stderr": 0.032780101431233326,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.47627833893064514,
"mc2_stderr": 0.01515113122576049
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946705,
"acc_norm": 0.6399317406143344,
"acc_norm_stderr": 0.014027516814585186
},
"harness|hellaswag|10": {
"acc": 0.6202947619996017,
"acc_stderr": 0.004843216325090254,
"acc_norm": 0.82194781915953,
"acc_norm_stderr": 0.003817748269107782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400352,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400352
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847835,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847835
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368976,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368976
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.0151665445504903,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.0151665445504903
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083143,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.47627833893064514,
"mc2_stderr": 0.01515113122576049
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019823
},
"harness|gsm8k|5": {
"acc": 0.4715693707354056,
"acc_stderr": 0.013750202076584424
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_9 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1078490692.0
num_examples: 211801
download_size: 1098404186
dataset_size: 1078490692.0
---
# Dataset Card for "chunk_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Circularmachines/batch_indexing_machine_green_test_chroma | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: test
num_bytes: 77300794.0
num_examples: 420
download_size: 76248327
dataset_size: 77300794.0
---
# Dataset Card for "batch_indexing_machine_green_test_chroma"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UnderstandLing/oasst1_hi | ---
license: apache-2.0
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 103419755
num_examples: 81870
- name: validation
num_bytes: 4384159
num_examples: 3401
download_size: 29829039
dataset_size: 107803914
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Vikhrmodels/LLava_Instruct | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 10134246
num_examples: 8000
- name: test
num_bytes: 2515383
num_examples: 2000
download_size: 5627066
dataset_size: 12649629
---
# Dataset Card for "LLava_Instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
guangyil/amazon_tokenized | ---
dataset_info:
features:
- name: bert_token
sequence: int64
- name: gpt2_token
sequence: int64
splits:
- name: train
num_bytes: 173553456.7202345
num_examples: 551455
- name: test
num_bytes: 261864.0
num_examples: 1000
download_size: 42652803
dataset_size: 173815320.7202345
---
# Dataset Card for "amazon_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deutsche-telekom/ger-backtrans-paraphrase | ---
license:
- cc-by-sa-4.0
language:
- de
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
task_categories:
- sentence-similarity
---
# German Backtranslated Paraphrase Dataset
This is a dataset of more than 21 million German paraphrases.
These are text pairs that have the same meaning but are expressed with different words.
The sources of the paraphrases are different parallel German / English text corpora.
The English texts were machine translated back into German to obtain the paraphrases.
This dataset can be used, for example, to train semantic text embeddings. For this,
[SentenceTransformers](https://www.sbert.net/)
together with the [MultipleNegativesRankingLoss](https://www.sbert.net/docs/package_reference/losses.html#multiplenegativesrankingloss)
can be used.
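Below is a minimal training sketch using the classic sentence-transformers `fit` API; the base model and the subset size are illustrative assumptions, not recommendations:
```python
# pip install sentence-transformers datasets
from datasets import load_dataset
from sentence_transformers import InputExample, SentenceTransformer, losses
from torch.utils.data import DataLoader

dataset = load_dataset("deutsche-telekom/ger-backtrans-paraphrase", split="train")
train_examples = [
    InputExample(texts=[row["de"], row["en_de"]])  # one paraphrase pair per example
    for row in dataset.select(range(10_000))       # small subset, just for the sketch
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=32)

model = SentenceTransformer("deepset/gbert-large")  # illustrative German base model
train_loss = losses.MultipleNegativesRankingLoss(model)  # in-batch negatives
model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1)
```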
## Creator
This data set was compiled and open sourced by [Philip May](https://may.la/)
of [Deutsche Telekom](https://www.telekom.de/).
## Our pre-processing
Apart from the back translation, we have added more columns (for details see below). We have carried out the following pre-processing and filtering:
- We dropped text pairs where one text was longer than 499 characters.
- In the [GlobalVoices v2018q4](https://opus.nlpl.eu/GlobalVoices-v2018q4.php) texts we have removed the `" · Global Voices"` suffix.
## Your post-processing
You probably don't want to use the dataset as it is, but filter it further.
This is what the additional columns of the dataset are for.
For us it has proven useful to delete pairs of sentences that meet any of the following criteria (a filtering sketch follows the list):
- `min_char_len` less than 15
- `jaccard_similarity` greater than 0.3
- `de_token_count` greater than 30
- `en_de_token_count` greater than 30
- `cos_sim` less than 0.85
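A minimal pandas sketch of this post-processing, assuming the data has been loaded into a DataFrame `df` (see the loading instructions further down); the thresholds are exactly the ones listed above:
```python
import pandas as pd

df = pd.read_csv("train.csv")  # see "Load this dataset" below

keep = (
    (df["min_char_len"] >= 15)
    & (df["jaccard_similarity"] <= 0.3)
    & (df["de_token_count"] <= 30)
    & (df["en_de_token_count"] <= 30)
    & (df["cos_sim"] >= 0.85)
)
df = df[keep]
```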
## Columns description
- **`uuid`**: a uuid calculated with Python `uuid.uuid4()`
- **`en`**: the original English texts from the corpus
- **`de`**: the original German texts from the corpus
- **`en_de`**: the German texts translated back from English (from `en`)
- **`corpus`**: the name of the corpus
- **`min_char_len`**: the number of characters of the shortest text
- **`jaccard_similarity`**: the [Jaccard similarity coefficient](https://en.wikipedia.org/wiki/Jaccard_index) of both sentences - see below for more details
- **`de_token_count`**: number of tokens of the `de` text, tokenized with [deepset/gbert-large](https://huggingface.co/deepset/gbert-large)
- **`en_de_token_count`**: number of tokens of the `en_de` text, tokenized with [deepset/gbert-large](https://huggingface.co/deepset/gbert-large)
- **`cos_sim`**: the [cosine similarity](https://en.wikipedia.org/wiki/Cosine_similarity) of both sentences measured with [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2)
## Anomalies in the texts
It is noticeable that the [OpenSubtitles](https://opus.nlpl.eu/OpenSubtitles-v2018.php) texts have odd dash prefixes. They look like this:
```
- Hast du was draufgetan?
```
To remove them you could apply this function:
```python
import re

def clean_text(text):
    # Strip leading and trailing runs of dashes and whitespace.
    text = re.sub(r"^[-\s]*", "", text)  # raw strings avoid invalid-escape warnings
    text = re.sub(r"[-\s]*$", "", text)
    return text

df["de"] = df["de"].apply(clean_text)        # df: the dataset loaded as a DataFrame
df["en_de"] = df["en_de"].apply(clean_text)
```
## Parallel text corpora used
| Corpus name & link | Number of paraphrases |
|-----------------------------------------------------------------------|----------------------:|
| [OpenSubtitles](https://opus.nlpl.eu/OpenSubtitles-v2018.php) | 18,764,810 |
| [WikiMatrix v1](https://opus.nlpl.eu/WikiMatrix-v1.php) | 1,569,231 |
| [Tatoeba v2022-03-03](https://opus.nlpl.eu/Tatoeba-v2022-03-03.php) | 313,105 |
| [TED2020 v1](https://opus.nlpl.eu/TED2020-v1.php) | 289,374 |
| [News-Commentary v16](https://opus.nlpl.eu/News-Commentary-v16.php) | 285,722 |
| [GlobalVoices v2018q4](https://opus.nlpl.eu/GlobalVoices-v2018q4.php) | 70,547 |
| **sum**                                                               | **21,292,789**        |
## Back translation
We have made the back translation from English to German with the help of [Fairseq](https://github.com/facebookresearch/fairseq).
We used the `transformer.wmt19.en-de` model for this purpose:
```python
import torch

# Load the WMT'19 English->German ensemble via the fairseq torch.hub interface.
en2de = torch.hub.load(
"pytorch/fairseq",
"transformer.wmt19.en-de",
checkpoint_file="model1.pt:model2.pt:model3.pt:model4.pt",
tokenizer="moses",
bpe="fastbpe",
)
```
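For illustration, the loaded hub model exposes a `translate` method; the following sketch matches the usage documented in the fairseq README:
```python
en2de.eval()  # disable dropout for inference
en2de.translate("Hello world!")  # -> 'Hallo Welt!'
```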
## How the Jaccard similarity was calculated
To calculate the [Jaccard similarity coefficient](https://en.wikipedia.org/wiki/Jaccard_index)
we are using the [SoMaJo tokenizer](https://github.com/tsproisl/SoMaJo)
to split the texts into tokens.
We then `lower()` the tokens so that upper and lower case letters no longer make a difference. Below you can find a code snippet with the details:
```python
from somajo import SoMaJo
LANGUAGE = "de_CMC"
somajo_tokenizer = SoMaJo(LANGUAGE)
def get_token_set(text, somajo_tokenizer):
sentences = somajo_tokenizer.tokenize_text([text])
tokens = [t.text.lower() for sentence in sentences for t in sentence]
token_set = set(tokens)
return token_set
def jaccard_similarity(text1, text2, somajo_tokenizer):
token_set1 = get_token_set(text1, somajo_tokenizer=somajo_tokenizer)
token_set2 = get_token_set(text2, somajo_tokenizer=somajo_tokenizer)
intersection = token_set1.intersection(token_set2)
union = token_set1.union(token_set2)
jaccard_similarity = float(len(intersection)) / len(union)
return jaccard_similarity
```
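As a quick sanity check, consider two sentences differing in a single word (assuming SoMaJo splits off the final period as its own token):
```python
sim = jaccard_similarity(
    "Das ist ein Test.",
    "Dies ist ein Test.",
    somajo_tokenizer=somajo_tokenizer,
)
print(sim)  # 4 shared tokens / 6 distinct tokens = ~0.67
```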
## Load this dataset
### With Hugging Face Datasets
```python
# pip install datasets
from datasets import load_dataset
dataset = load_dataset("deutsche-telekom/ger-backtrans-paraphrase")
train_dataset = dataset["train"]
```
### With Pandas
If you want to download the CSV file and then load it with Pandas, you can do it like this:
```python
import pandas as pd

df = pd.read_csv("train.csv")
```
## Citations, Acknowledgements and Licenses
**OpenSubtitles**
- citation: P. Lison and J. Tiedemann, 2016, [OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles](http://www.lrec-conf.org/proceedings/lrec2016/pdf/947_Paper.pdf). In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016)
- also see http://www.opensubtitles.org/
- license: no special license has been provided at OPUS for this dataset
**WikiMatrix v1**
- citation: Holger Schwenk, Vishrav Chaudhary, Shuo Sun, Hongyu Gong and Paco Guzman, [WikiMatrix: Mining 135M Parallel Sentences in 1620 Language Pairs from Wikipedia](https://arxiv.org/abs/1907.05791), arXiv, July 11 2019
- license: [CC-BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
**Tatoeba v2022-03-03**
- citation: J. Tiedemann, 2012, [Parallel Data, Tools and Interfaces in OPUS](https://opus.nlpl.eu/Tatoeba-v2022-03-03.php). In Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC 2012)
- license: [CC BY 2.0 FR](https://creativecommons.org/licenses/by/2.0/fr/)
- copyright: https://tatoeba.org/eng/terms_of_use
**TED2020 v1**
- citation: Reimers, Nils and Gurevych, Iryna, [Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation](https://arxiv.org/abs/2004.09813), In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, November 2020
- acknowledgements to [OPUS](https://opus.nlpl.eu/) for this service
- license: please respect the [TED Talks Usage Policy](https://www.ted.com/about/our-organization/our-policies-terms/ted-talks-usage-policy)
**News-Commentary v16**
- citation: J. Tiedemann, 2012, [Parallel Data, Tools and Interfaces in OPUS](https://opus.nlpl.eu/Tatoeba-v2022-03-03.php). In Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC 2012)
- license: no special license has been provided at OPUS for this dataset
**GlobalVoices v2018q4**
- citation: J. Tiedemann, 2012, [Parallel Data, Tools and Interfaces in OPUS](https://opus.nlpl.eu/Tatoeba-v2022-03-03.php). In Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC 2012)
- license: no special license has been provided at OPUS for this dataset
## Citation
```latex
@misc{ger-backtrans-paraphrase,
title={Deutsche-Telekom/ger-backtrans-paraphrase - dataset at Hugging Face},
url={https://huggingface.co/datasets/deutsche-telekom/ger-backtrans-paraphrase},
year={2022},
author={May, Philip}
}
```
## Licensing
Copyright (c) 2022 [Philip May](https://may.la/),
[Deutsche Telekom AG](https://www.telekom.com/)
This work is licensed under [CC-BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/).
|
Tamazight-NLP/FLORES-200-Standard-Moroccan-Tamazight | ---
license: cc-by-sa-4.0
task_categories:
- translation
- text2text-generation
language:
- en
- zgh
- ber
annotations_creators:
- expert-generated
pretty_name: FLORES 200 (Standard Moroccan Tamazight)
size_categories:
- 1K<n<10K
--- |
presencesw/dataset4_translated | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: references
sequence: string
- name: question_vi
dtype: string
- name: answer_vi
dtype: string
- name: references_vi
sequence: string
splits:
- name: train
num_bytes: 46459947
num_examples: 7579
- name: validation
num_bytes: 6144964
num_examples: 1000
- name: test
num_bytes: 2479029
num_examples: 400
download_size: 28297523
dataset_size: 55083940
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
awhall/aita_21-10_23-09 | ---
license: mit
---
|
mehdiselbi/snoopdogg-QA | ---
license: mit
---
|
ashu3984/PHYSIGENAI-phy-small | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 346786
num_examples: 785
download_size: 112107
dataset_size: 346786
---
# Dataset Card for "PHYSIGENAI-phy-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aciborowska/customers-complaints-test | ---
dataset_info:
features:
- name: Date_received
dtype: string
- name: Product
dtype: string
- name: Sub_product
dtype: string
- name: Issue
dtype: string
- name: Sub_issue
dtype: string
- name: Consumer_complaint_narrative
dtype: string
- name: Company_public_response
dtype: string
- name: Company
dtype: string
- name: State
dtype: string
- name: ZIP_code
dtype: string
- name: Tags
dtype: string
- name: Consumer_consent_provided?
dtype: string
- name: Submitted_via
dtype: string
- name: Date_sent_to_company
dtype: string
- name: Company response to consumer
dtype: string
- name: Timely_response?
dtype: string
- name: Consumer_disputed?
dtype: string
- name: Complaint_ID
dtype: int64
splits:
- name: train
num_bytes: 4068482
num_examples: 3000
download_size: 1612360
dataset_size: 4068482
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "customers-complaints-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_second_sent_train_50_eval_10_baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 80686
num_examples: 50
- name: validation
num_bytes: 15357
num_examples: 10
download_size: 0
dataset_size: 96043
---
# Dataset Card for "find_second_sent_train_50_eval_10_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
usmanyaqoob/Humman-Emotions-Dataset | ---
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: Emotions
size_categories:
- n<1K
--- |
jimmycarter/textocr-gpt4v | ---
license: cc-by-nc-4.0
language:
- en
pretty_name: textocr-gpt4v
task_categories:
- image-to-text
- visual-question-answering
size_categories:
- 10K<n<100K
---
# Dataset Card for TextOCR-GPT4V
## Dataset Description
- **Point of Contact:** APJC (me)
### Dataset Summary
TextOCR-GPT4V is Meta's [TextOCR dataset](https://textvqa.org/textocr/) captioned with an emphasis on text OCR using GPT4V. To get the images, you will need to agree to their terms of service.
### Supported Tasks
The TextOCR-GPT4V dataset is intended for generating benchmarks that compare an MLLM against GPT4V.
### Languages
The captions are in English, while the text appearing in the images is in many languages, such as Spanish, Japanese, and Hindi.
### Original Prompts
The `caption` field was produced with the following prompt with the `gpt-4-vision-preview` model:
```
Can you please describe the contents of this image in the following way: (1) In one to two sentences at most under the heading entitled 'DESCRIPTION' (2) Transcribe any text found within the image and where it is located under the heading entitled 'TEXT'?\n\nFor example, you might describe a picture of a palm tree with a logo on it in the center that spells the word COCONUT as:\n\nDESCRIPTION\nA photograph of a palm tree on a beach somewhere, there is a blue sky in the background and it is a sunny day. There is a blue text logo with white outline in the center of the image.\n\nTEXT\nThe text logo in the center of the image says, \"COCONUT\".\n\nBe sure to describe all the text that is found in the image.
```
The `caption_condensed` field was produced with the following prompt using the `gpt-4-1106-preview` model:
```
Please make the following description of an image that may or may not have text into a single description of 120 words or less.
{caption}
Be terse and do not add extraneous details. Keep the description as a single, unbroken paragraph.
```
### Data Instances
An example of "train" looks as follows:
```json
{
"filename": "aabbccddeeff0011.jpg",
"caption": "DESCRIPTION\nA banana.\n\nTEXT\nThe banana has a sticker on it that says \"Fruit Company\".",
"caption_image": "A banana.",
"caption_text": "The banana has a sticker on it that says \"Fruit Company\".",
"caption_condensed": "A banana that has a sticker on it that says \"Fruit Company\".",
}
```
### Data Fields
The data fields are as follows (a small parsing sketch follows the list):
* `filename`: The filename of the image from the original [TextOCR dataset](https://textvqa.org/textocr/).
* `caption`: A caption with both a `DESCRIPTION` and `TEXT` part.
* `caption_image`: The `DESCRIPTION` part of the caption.
* `caption_text`: The `TEXT` part of the caption.
* `caption_condensed`: GPT4 distilled version of the original caption onto a single line.
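Since `caption` concatenates both parts under fixed `DESCRIPTION` and `TEXT` headings (see the example instance above), the two parts can be recovered with a simple split; a sketch, not part of any official tooling:
```python
def split_caption(caption: str) -> tuple[str, str]:
    """Split a combined caption into its DESCRIPTION and TEXT parts."""
    description_part, _, text_part = caption.partition("\n\nTEXT\n")
    description = description_part.replace("DESCRIPTION\n", "", 1)
    return description.strip(), text_part.strip()

caption = "DESCRIPTION\nA banana.\n\nTEXT\nThe banana has a sticker on it."
print(split_caption(caption))  # ('A banana.', 'The banana has a sticker on it.')
```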
### Data Splits
| | train |
|---------------|------:|
| textocr-gpt4v | 25114 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
The `textocr-gpt4v` data is generated by a vision-language model (`gpt-4-vision-preview`) and inevitably contains some errors or biases. We encourage users to treat this data with caution and to propose new methods to filter or correct these imperfections.
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is available under the [Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode) license.
### Citation Information
```
@misc{textocr-gpt4v,
author = { Jimmy Carter },
title = {TextOCR-GPT4V},
year = {2024},
publisher = {Huggingface},
journal = {Huggingface repository},
howpublished = {\url{https://huggingface.co/datasets/jimmycarter/textocr-gpt4v}},
}
```
### Contributions
[More Information Needed] |
hlt-lab/dailydialogsample-jumble | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: reference
dtype: string
splits:
- name: train
num_bytes: 46955
num_examples: 100
download_size: 36773
dataset_size: 46955
---
# Dataset Card for "dailydialogsample-jumble"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5 | ---
pretty_name: Evaluation run of Tensoic/Kan-Llama-SFT-v0.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Tensoic/Kan-Llama-SFT-v0.5](https://huggingface.co/Tensoic/Kan-Llama-SFT-v0.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T01:43:44.197286](https://huggingface.co/datasets/open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5/blob/main/results_2024-01-24T01-43-44.197286.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4272736498001841,\n\
\ \"acc_stderr\": 0.03426594520244024,\n \"acc_norm\": 0.43301807406696846,\n\
\ \"acc_norm_stderr\": 0.03509638961981207,\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.4744031768522622,\n\
\ \"mc2_stderr\": 0.015238059013971565\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42918088737201365,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.47440273037542663,\n \"acc_norm_stderr\": 0.01459223088529896\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5372435769766979,\n\
\ \"acc_stderr\": 0.00497591966511654,\n \"acc_norm\": 0.7271459868552081,\n\
\ \"acc_norm_stderr\": 0.004445160997618376\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389177,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.44516129032258067,\n\
\ \"acc_stderr\": 0.028272410186214906,\n \"acc_norm\": 0.44516129032258067,\n\
\ \"acc_norm_stderr\": 0.028272410186214906\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03815494308688929,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03815494308688929\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056128,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056128\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396976,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095932,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095932\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.03156663099215416,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5229357798165137,\n \"acc_stderr\": 0.0214147570581755,\n \"acc_norm\"\
: 0.5229357798165137,\n \"acc_norm_stderr\": 0.0214147570581755\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n\
\ \"acc_stderr\": 0.032259413526312945,\n \"acc_norm\": 0.33796296296296297,\n\
\ \"acc_norm_stderr\": 0.032259413526312945\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.034849415144292316,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.034849415144292316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.030964810588786713,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.030964810588786713\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.03906947479456601,\n\
\ \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.03906947479456601\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458934,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458934\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6538461538461539,\n\
\ \"acc_stderr\": 0.0311669573672359,\n \"acc_norm\": 0.6538461538461539,\n\
\ \"acc_norm_stderr\": 0.0311669573672359\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5696040868454662,\n\
\ \"acc_stderr\": 0.017705868776292388,\n \"acc_norm\": 0.5696040868454662,\n\
\ \"acc_norm_stderr\": 0.017705868776292388\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4421965317919075,\n \"acc_stderr\": 0.0267386036438074,\n\
\ \"acc_norm\": 0.4421965317919075,\n \"acc_norm_stderr\": 0.0267386036438074\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761976,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761976\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.028541722692618874,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.028541722692618874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5144694533762058,\n\
\ \"acc_stderr\": 0.028386198084177673,\n \"acc_norm\": 0.5144694533762058,\n\
\ \"acc_norm_stderr\": 0.028386198084177673\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4228395061728395,\n \"acc_stderr\": 0.027487472980871605,\n\
\ \"acc_norm\": 0.4228395061728395,\n \"acc_norm_stderr\": 0.027487472980871605\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3272490221642764,\n\
\ \"acc_stderr\": 0.011983819806464733,\n \"acc_norm\": 0.3272490221642764,\n\
\ \"acc_norm_stderr\": 0.011983819806464733\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42320261437908496,\n \"acc_stderr\": 0.01998780976948206,\n \
\ \"acc_norm\": 0.42320261437908496,\n \"acc_norm_stderr\": 0.01998780976948206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278985,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.03093285879278985\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.03512310964123935,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.03512310964123935\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322416,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322416\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.03786720706234214,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.03786720706234214\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.4744031768522622,\n\
\ \"mc2_stderr\": 0.015238059013971565\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634472\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \
\ \"acc_stderr\": 0.0062163286402380944\n }\n}\n```"
repo_url: https://huggingface.co/Tensoic/Kan-Llama-SFT-v0.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|arc:challenge|25_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|gsm8k|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hellaswag|10_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T01-43-44.197286.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- '**/details_harness|winogrande|5_2024-01-24T01-43-44.197286.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T01-43-44.197286.parquet'
- config_name: results
data_files:
- split: 2024_01_24T01_43_44.197286
path:
- results_2024-01-24T01-43-44.197286.parquet
- split: latest
path:
- results_2024-01-24T01-43-44.197286.parquet
---
# Dataset Card for Evaluation run of Tensoic/Kan-Llama-SFT-v0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Tensoic/Kan-Llama-SFT-v0.5](https://huggingface.co/Tensoic/Kan-Llama-SFT-v0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5",
"harness_winogrande_5",
split="train")
```
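The aggregated `results` configuration described above can be loaded the same way, for instance:
```python
from datasets import load_dataset

# "latest" always points to the results of the most recent run.
results = load_dataset("open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5",
	"results",
	split="latest")
```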
## Latest results
These are the [latest results from run 2024-01-24T01:43:44.197286](https://huggingface.co/datasets/open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5/blob/main/results_2024-01-24T01-43-44.197286.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4272736498001841,
"acc_stderr": 0.03426594520244024,
"acc_norm": 0.43301807406696846,
"acc_norm_stderr": 0.03509638961981207,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916912,
"mc2": 0.4744031768522622,
"mc2_stderr": 0.015238059013971565
},
"harness|arc:challenge|25": {
"acc": 0.42918088737201365,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.47440273037542663,
"acc_norm_stderr": 0.01459223088529896
},
"harness|hellaswag|10": {
"acc": 0.5372435769766979,
"acc_stderr": 0.00497591966511654,
"acc_norm": 0.7271459868552081,
"acc_norm_stderr": 0.004445160997618376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389177,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44516129032258067,
"acc_stderr": 0.028272410186214906,
"acc_norm": 0.44516129032258067,
"acc_norm_stderr": 0.028272410186214906
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03815494308688929,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03815494308688929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056128,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056128
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396976,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095932,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095932
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5229357798165137,
"acc_stderr": 0.0214147570581755,
"acc_norm": 0.5229357798165137,
"acc_norm_stderr": 0.0214147570581755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.032259413526312945,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.032259413526312945
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.034849415144292316,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.034849415144292316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.030964810588786713,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.030964810588786713
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44785276073619634,
"acc_stderr": 0.03906947479456601,
"acc_norm": 0.44785276073619634,
"acc_norm_stderr": 0.03906947479456601
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458934,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458934
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.0311669573672359,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.0311669573672359
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5696040868454662,
"acc_stderr": 0.017705868776292388,
"acc_norm": 0.5696040868454662,
"acc_norm_stderr": 0.017705868776292388
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4421965317919075,
"acc_stderr": 0.0267386036438074,
"acc_norm": 0.4421965317919075,
"acc_norm_stderr": 0.0267386036438074
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761976,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761976
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5144694533762058,
"acc_stderr": 0.028386198084177673,
"acc_norm": 0.5144694533762058,
"acc_norm_stderr": 0.028386198084177673
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4228395061728395,
"acc_stderr": 0.027487472980871605,
"acc_norm": 0.4228395061728395,
"acc_norm_stderr": 0.027487472980871605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3272490221642764,
"acc_stderr": 0.011983819806464733,
"acc_norm": 0.3272490221642764,
"acc_norm_stderr": 0.011983819806464733
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42320261437908496,
"acc_stderr": 0.01998780976948206,
"acc_norm": 0.42320261437908496,
"acc_norm_stderr": 0.01998780976948206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123935,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123935
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322416,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322416
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.03786720706234214,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.03786720706234214
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916912,
"mc2": 0.4744031768522622,
"mc2_stderr": 0.015238059013971565
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634472
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.0062163286402380944
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liswei/rm-static-zhTW | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt_zh
dtype: string
- name: response_zh
dtype: string
- name: chosen_zh
dtype: string
- name: rejected_zh
dtype: string
splits:
- name: train
num_bytes: 198602975
num_examples: 76256
- name: test
num_bytes: 13365684
num_examples: 5103
download_size: 129737844
dataset_size: 211968659
task_categories:
- text2text-generation
- text-generation
- text-classification
language:
- zh
pretty_name: rm-static-zhTW
size_categories:
- 10K<n<100K
tags:
- instruction-finetuning
- rlhf
---
# Dataset Card for "rm-static-m2m100-zh"
Traditional Chinese translation of the [Dahoas/rm-static](https://huggingface.co/datasets/Dahoas/rm-static) dataset.
The dataset is first translated into Simplified Chinese using [facebook/m2m100-12B-last-ckpt](https://huggingface.co/facebook/m2m100-12B-last-ckpt) and greedy decoding.
The translation is then filtered and converted into Traditional Chinese using [OpenCC](https://github.com/BYVoid/OpenCC); a minimal sketch of that step is shown below.
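A sketch of the conversion step, assuming the `opencc` Python package (the exact conversion config used for this dataset is not documented, so `s2twp` below is an assumption):
```python
# Sketch of the Simplified -> Traditional step; `s2twp` (Simplified Chinese to
# Taiwan-standard Traditional with phrase conversion) is an assumed config.
from opencc import OpenCC

converter = OpenCC('s2twp')
print(converter.convert('简体中文的翻译'))  # -> '簡體中文的翻譯'
```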
The dataset may contain samples with translation errors; we plan to release a filtered version of this dataset in the future. |
LahiruLowe/flan2021_explanation_targets_vilsonrodrigues_falcon7b_instructsharded | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 182504
num_examples: 136
download_size: 102439
dataset_size: 182504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "flan2021_explanation_targets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
miittnnss/test-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 18445.0
num_examples: 2
download_size: 20023
dataset_size: 18445.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
GEM-submissions/lewtun__this-is-a-test-submission__1656013291 | ---
benchmark: gem
type: prediction
submission_name: This is a test submission
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test submission
|
PSegs/psegs-ios-lidar-ext | ---
license: apache-2.0
size_categories:
- n<1K
---
# PSegs iOS Lidar Extension
[](https://opensource.org/licenses/Apache-2.0)
This project contains data captured using Lidar-equipped iPhone(s)
for use as an extension with the
[PSegs](https://github.com/pwais/psegs) project.
# Structure
* [threeDScannerApp_data](https://huggingface.co/datasets/PSegs/psegs-ios-lidar-ext/tree/main/threeDScannerApp_data) - This is test data captured
using the [3D Scanner App](https://3dscannerapp.com/) for iOS.
* [ps_external_test_fixtures](https://huggingface.co/datasets/PSegs/psegs-ios-lidar-ext/tree/main/ps_external_test_fixtures) - These are fixtures
created using the data in this repo and code in
[PSegs](https://github.com/pwais/psegs). They are hosted here and
provided to power [PSegs](https://github.com/pwais/psegs) unit tests.
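A minimal sketch for fetching only the scan captures with `huggingface_hub` (the `allow_patterns` filter is illustrative, not part of the PSegs tooling):
```python
# Download just the 3D Scanner App captures from this dataset repo; assumes
# only that huggingface_hub is installed and the repo layout shown above.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="PSegs/psegs-ios-lidar-ext",
    repo_type="dataset",
    allow_patterns=["threeDScannerApp_data/*"],  # skip the unit-test fixtures
)
print(local_dir)
```
The snippet above is only a convenience for manual inspection of the captures. |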
ekolasky/BlogClassForLSGSeqClass | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: start_positions
sequence: int64
- name: end_positions
sequence: int64
splits:
- name: train
num_bytes: 916965
num_examples: 127
download_size: 377932
dataset_size: 916965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/yusa_kozue_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yusa_kozue/遊佐こずえ/유사코즈에 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yusa_kozue/遊佐こずえ/유사코즈에 (THE iDOLM@STER: Cinderella Girls), containing 382 images and their tags.
The core tags of this character are `blonde_hair, green_eyes, ahoge, twintails, long_hair, low_twintails, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 382 | 475.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yusa_kozue_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 382 | 270.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yusa_kozue_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 893 | 576.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yusa_kozue_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 382 | 422.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yusa_kozue_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 893 | 843.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yusa_kozue_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yusa_kozue_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, cat_ears, cat_girl, cat_tail, dress, looking_at_viewer, open_mouth, solo, animal_ear_fluff, simple_background, white_background, bell, between_legs, collarbone, long_sleeves, sitting |
| 1 | 14 |  |  |  |  |  | 1girl, blush, solo, dress, open_mouth, looking_at_viewer, socks, sitting, wrist_cuffs |
| 2 | 18 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, open_mouth, white_background, brown_dress, hair_bow, simple_background, :d, frilled_dress, long_sleeves, plaid_dress, kneehighs, shoes, white_socks |
| 3 | 15 |  |  |  |  |  | blush, hair_flower, looking_at_viewer, 1girl, head_wreath, solo, navel, open_mouth, white_background, wrist_cuffs, skirt, bare_shoulders, dress, pink_flower, collarbone, :o, flower_necklace, simple_background, :d, fairy_wings, flower_wreath, sandals, shirt, swept_bangs, upper_body |
| 4 | 7 |  |  |  |  |  | 1girl, blush, loli, navel, simple_background, solo, white_background, groin, nipples, flat_chest, looking_at_viewer, open_mouth, ass_visible_through_thighs, clothes_lift, lifted_by_self, nude, pussy, shirt |
| 5 | 7 |  |  |  |  |  | 1girl, blush, loli, open_mouth, 1boy, flat_chest, hetero, navel, nipples, sex, spread_legs, censored, completely_nude, cum_in_pussy, penis, solo_focus, thighs, vaginal, girl_on_top, overflow, pov, smile, straddling |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | cat_ears | cat_girl | cat_tail | dress | looking_at_viewer | open_mouth | solo | animal_ear_fluff | simple_background | white_background | bell | between_legs | collarbone | long_sleeves | sitting | socks | wrist_cuffs | brown_dress | hair_bow | :d | frilled_dress | plaid_dress | kneehighs | shoes | white_socks | hair_flower | head_wreath | navel | skirt | bare_shoulders | pink_flower | :o | flower_necklace | fairy_wings | flower_wreath | sandals | shirt | swept_bangs | upper_body | loli | groin | nipples | flat_chest | ass_visible_through_thighs | clothes_lift | lifted_by_self | nude | pussy | 1boy | hetero | sex | spread_legs | censored | completely_nude | cum_in_pussy | penis | solo_focus | thighs | vaginal | girl_on_top | overflow | pov | smile | straddling |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------|:-----------|:-----------|:--------|:--------------------|:-------------|:-------|:-------------------|:--------------------|:-------------------|:-------|:---------------|:-------------|:---------------|:----------|:--------|:--------------|:--------------|:-----------|:-----|:----------------|:--------------|:------------|:--------|:--------------|:--------------|:--------------|:--------|:--------|:-----------------|:--------------|:-----|:------------------|:--------------|:----------------|:----------|:--------|:--------------|:-------------|:-------|:--------|:----------|:-------------|:-----------------------------|:---------------|:-----------------|:-------|:--------|:-------|:---------|:------|:--------------|:-----------|:------------------|:---------------|:--------|:-------------|:---------|:----------|:--------------|:-----------|:------|:--------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | | | | X | X | X | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | X | X | | | | | X | X | X | | X | X | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | | | | X | X | X | X | | X | X | | | X | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | | | X | X | X | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
romariocamilo/lulucaspre.mp3 | ---
license: openrail
---
|
5CD-AI/Vietnamese-nvidia-OpenMathInstruct-1-50k-gg-translated | ---
task_categories:
- text-generation
- question-answering
language:
- vi
- en
tags:
- math
- code
- nvidia
size_categories:
- 10K<n<100K
--- |
yangxg/test | ---
license: apache-2.0
task_categories:
- image-classification
- translation
language:
- en
tags:
- biology
size_categories:
- 10M<n<100M
--- |
emozilla/quality | ---
language: en
dataset_info:
features:
- name: article
dtype: string
- name: question
dtype: string
- name: options
sequence: string
- name: answer
dtype: int64
- name: hard
dtype: bool
splits:
- name: train
num_bytes: 62597212
num_examples: 2523
- name: validation
num_bytes: 51198650
num_examples: 2086
download_size: 14352147
dataset_size: 113795862
---
# Dataset Card for "quality"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/VALUE_rte_dey_it | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4759
num_examples: 12
- name: test
num_bytes: 47590
num_examples: 117
- name: train
num_bytes: 59365
num_examples: 125
download_size: 6768
dataset_size: 111714
---
# Dataset Card for "VALUE_rte_dey_it"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RaushanTurganbay/hw-intent-atis | ---
license: apache-2.0
---
|
stoddur/med_chat_10 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 250915440.0
num_examples: 162510
download_size: 6808373
dataset_size: 250915440.0
---
# Dataset Card for "med_chat_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/DTD_parition1_test_eachadea_vicuna_13b_1.1_mode_T_SPECIFIC_A_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 830475
num_examples: 1880
download_size: 183474
dataset_size: 830475
---
# Dataset Card for "DTD_parition1_test_eachadea_vicuna_13b_1.1_mode_T_SPECIFIC_A_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akadhim-ai/ios_icons | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 768688.0
num_examples: 10
download_size: 769873
dataset_size: 768688.0
---
# Dataset Card for "ios_icons"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-squad_v2-squad_v2-8571ec-1652758611 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: SupriyaArun/bert-base-uncased-finetuned-squad
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: SupriyaArun/bert-base-uncased-finetuned-squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Ryan20/hotel_dataset_pushed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: answers
sequence: string
- name: context
dtype: string
- name: questions
sequence: string
splits:
- name: train
num_bytes: 4634
num_examples: 7
download_size: 7932
dataset_size: 4634
---
# Dataset Card for "hotel_dataset_pushed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HazSylvia/Fitness_Unformatted | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- biology
- fitness
- fit
- gym
- health
pretty_name: Fitness
size_categories:
- n<1K
--- |
HanxuHU/mmmu_de | ---
dataset_info:
- config_name: Accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1599764.0
num_examples: 30
download_size: 1536376
dataset_size: 1599764.0
- config_name: Agriculture
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 119218079.0
num_examples: 30
download_size: 119223778
dataset_size: 119218079.0
- config_name: Art_Theory
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 33481614.0
num_examples: 30
download_size: 29784258
dataset_size: 33481614.0
- config_name: Clinical_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 10883415.0
num_examples: 30
download_size: 10887096
dataset_size: 10883415.0
- config_name: Design
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 17923444.0
num_examples: 30
download_size: 16227890
dataset_size: 17923444.0
- config_name: Diagnostics_and_Laboratory_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 37106618.0
num_examples: 30
download_size: 37090475
dataset_size: 37106618.0
- config_name: Economics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1487985.0
num_examples: 30
download_size: 1425179
dataset_size: 1487985.0
- config_name: Energy_and_Power
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1643312.0
num_examples: 30
download_size: 1647583
dataset_size: 1643312.0
- config_name: Finance
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1072662.0
num_examples: 30
download_size: 1004589
dataset_size: 1072662.0
- config_name: Geography
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 6671745.0
num_examples: 30
download_size: 6678013
dataset_size: 6671745.0
- config_name: History
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8820529.0
num_examples: 30
download_size: 8430938
dataset_size: 8820529.0
- config_name: Literature
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 14241754.0
num_examples: 30
download_size: 14246959
dataset_size: 14241754.0
- config_name: Manage
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 3279887.0
num_examples: 30
download_size: 3142892
dataset_size: 3279887.0
- config_name: Marketing
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1474291.0
num_examples: 30
download_size: 1362031
dataset_size: 1474291.0
- config_name: Mechanical_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 876316.0
num_examples: 30
download_size: 878723
dataset_size: 876316.0
- config_name: Pharmacy
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1657130.0
num_examples: 30
download_size: 1551943
dataset_size: 1657130.0
- config_name: Physics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1115184.0
num_examples: 30
download_size: 1117717
dataset_size: 1115184.0
- config_name: Public_Health
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1510747.0
num_examples: 30
download_size: 1510884
dataset_size: 1510747.0
configs:
- config_name: Accounting
data_files:
- split: validation
path: Accounting/validation-*
- config_name: Agriculture
data_files:
- split: validation
path: Agriculture/validation-*
- config_name: Art_Theory
data_files:
- split: validation
path: Art_Theory/validation-*
- config_name: Clinical_Medicine
data_files:
- split: validation
path: Clinical_Medicine/validation-*
- config_name: Design
data_files:
- split: validation
path: Design/validation-*
- config_name: Diagnostics_and_Laboratory_Medicine
data_files:
- split: validation
path: Diagnostics_and_Laboratory_Medicine/validation-*
- config_name: Economics
data_files:
- split: validation
path: Economics/validation-*
- config_name: Energy_and_Power
data_files:
- split: validation
path: Energy_and_Power/validation-*
- config_name: Finance
data_files:
- split: validation
path: Finance/validation-*
- config_name: Geography
data_files:
- split: validation
path: Geography/validation-*
- config_name: History
data_files:
- split: validation
path: History/validation-*
- config_name: Literature
data_files:
- split: validation
path: Literature/validation-*
- config_name: Manage
data_files:
- split: validation
path: Manage/validation-*
- config_name: Marketing
data_files:
- split: validation
path: Marketing/validation-*
- config_name: Mechanical_Engineering
data_files:
- split: validation
path: Mechanical_Engineering/validation-*
- config_name: Pharmacy
data_files:
- split: validation
path: Pharmacy/validation-*
- config_name: Physics
data_files:
- split: validation
path: Physics/validation-*
- config_name: Public_Health
data_files:
- split: validation
path: Public_Health/validation-*
---
|
shujatoor/sroie_ocr | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1303802
num_examples: 5270
download_size: 599776
dataset_size: 1303802
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Venkateshwarang/Task2_Dataset | ---
dataset_info:
features:
- name: data
dtype: string
splits:
- name: train
num_bytes: 6674
num_examples: 17
download_size: 6315
dataset_size: 6674
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mouadse/medicare | ---
license: mit
---
|
shossain/merged-no-pad-text-32768 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 372426073
num_examples: 3036
download_size: 180967260
dataset_size: 372426073
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "merged-no-pad-text-32768"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2 | ---
pretty_name: Evaluation run of aloobun/bun_mistral_7b_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aloobun/bun_mistral_7b_v2](https://huggingface.co/aloobun/bun_mistral_7b_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T21:43:11.868828](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2/blob/main/results_2023-12-29T21-43-11.868828.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6156047359789459,\n\
\ \"acc_stderr\": 0.03249111009517131,\n \"acc_norm\": 0.6209297452635882,\n\
\ \"acc_norm_stderr\": 0.03315335422122162,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40666362595991745,\n\
\ \"mc2_stderr\": 0.01440530497666933\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870655,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719869\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6362278430591516,\n\
\ \"acc_stderr\": 0.00480100965769044,\n \"acc_norm\": 0.8265285799641505,\n\
\ \"acc_norm_stderr\": 0.0037788044746059103\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n\
\ \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n\
\ \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n\
\ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834832,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23128491620111732,\n\
\ \"acc_stderr\": 0.014102223623152573,\n \"acc_norm\": 0.23128491620111732,\n\
\ \"acc_norm_stderr\": 0.014102223623152573\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799215,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799215\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.012700582404768223,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.012700582404768223\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553704,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40666362595991745,\n\
\ \"mc2_stderr\": 0.01440530497666933\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209404\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3525398028809704,\n \
\ \"acc_stderr\": 0.013159909755930317\n }\n}\n```"
repo_url: https://huggingface.co/aloobun/bun_mistral_7b_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|arc:challenge|25_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|gsm8k|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hellaswag|10_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-43-11.868828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- '**/details_harness|winogrande|5_2023-12-29T21-43-11.868828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T21-43-11.868828.parquet'
- config_name: results
data_files:
- split: 2023_12_29T21_43_11.868828
path:
- results_2023-12-29T21-43-11.868828.parquet
- split: latest
path:
- results_2023-12-29T21-43-11.868828.parquet
---
# Dataset Card for Evaluation run of aloobun/bun_mistral_7b_v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aloobun/bun_mistral_7b_v2](https://huggingface.co/aloobun/bun_mistral_7b_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2",
"harness_winogrande_5",
split="train")
```
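The same pattern works for any configuration listed in this card's metadata; a minimal sketch, assuming the `results` configuration and the timestamped/`latest` split aliases defined in the configs above:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2"

# Aggregated scores for the run; the "latest" split alias always points
# at the most recent results parquet (see the `results` config above).
results = load_dataset(REPO, "results", split="latest")

# A specific run can also be addressed by its timestamped split name,
# e.g. the single run recorded in this card:
run = load_dataset(REPO, "harness_gsm8k_5", split="2023_12_29T21_43_11.868828")
```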
## Latest results
These are the [latest results from run 2023-12-29T21:43:11.868828](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2/blob/main/results_2023-12-29T21-43-11.868828.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6156047359789459,
"acc_stderr": 0.03249111009517131,
"acc_norm": 0.6209297452635882,
"acc_norm_stderr": 0.03315335422122162,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40666362595991745,
"mc2_stderr": 0.01440530497666933
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870655,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719869
},
"harness|hellaswag|10": {
"acc": 0.6362278430591516,
"acc_stderr": 0.00480100965769044,
"acc_norm": 0.8265285799641505,
"acc_norm_stderr": 0.0037788044746059103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834832,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23128491620111732,
"acc_stderr": 0.014102223623152573,
"acc_norm": 0.23128491620111732,
"acc_norm_stderr": 0.014102223623152573
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799215,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799215
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768223,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768223
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553704,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40666362595991745,
"mc2_stderr": 0.01440530497666933
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209404
},
"harness|gsm8k|5": {
"acc": 0.3525398028809704,
"acc_stderr": 0.013159909755930317
}
}
```
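The aggregated numbers above can also be pulled straight from the `results` configuration declared in this card's YAML header. Below is a minimal sketch, assuming the `latest` split layout shown there (the row schema mirrors the JSON above):

```python
from datasets import load_dataset

# The "results" config holds the aggregated-metrics rows for each run;
# the "latest" split always points at the newest results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```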
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
synthseq/flipflop | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: val_dense
path: data/val_dense-*
- split: val_sparse
path: data/val_sparse-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 825600000
num_examples: 1600000
- name: val
num_bytes: 8256000
num_examples: 16000
- name: val_dense
num_bytes: 2064000
num_examples: 4000
- name: val_sparse
num_bytes: 82560000
num_examples: 160000
download_size: 354675733
dataset_size: 918480000
---
Data for [**Flip-Flop Language Modeling**](https://arxiv.org/abs/2306.00946). The task is to correctly execute the sequential operations of a 1-bit register. The Transformer architecture, despite apparently being built for this operation, makes sporadic extrapolation errors (*attention glitches*). An open challenge is to fix these without recourse to long-tailed data or a recurrent architecture. Splits reflect the FFLM setup from the paper:
- `train`: 1.6M sequences from FFL(0.8) *(256 instructions, 80% ignore, 10% read, 10% write)*.
- `val`: 16K sequences from FFL(0.8).
- `val_dense`: 4K sequences from FFL(0.1).
- `val_sparse`: 160K sequences from FFL(0.98).
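As a concrete illustration of the FFL(p) distributions above, here is a minimal sampler sketch. It encodes our reading of the setup (the first instruction is a write; each later instruction is `i` with probability p, else `r`/`w` with equal probability; the bit after a read echoes the register), and `sample_ffl` is our own helper, not code from the paper:

```python
import random

def sample_ffl(p_ignore, n_instructions=256, seed=None):
    # Sample one valid FFL(p_ignore) string such as 'w1i1w0i0...'.
    rng = random.Random(seed)
    register = rng.choice('01')
    out = ['w', register]                      # start with a defining write
    for _ in range(n_instructions - 1):
        u = rng.random()
        if u < p_ignore:
            out += ['i', rng.choice('01')]     # distractor instruction
        elif u < p_ignore + (1 - p_ignore) / 2:
            out += ['r', register]             # read echoes current state
        else:
            register = rng.choice('01')
            out += ['w', register]             # write updates the state
    return ''.join(out)

print(sample_ffl(0.8)[:16])  # a prefix like 'w1i1w0i0...'
```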
Usage
---
```python
import torch
import datasets

dataset = datasets.load_dataset('synthseq/flipflop')
dataset['train'][0] # {'text': 'w1i1w0i0 ...

def tokenize_batch(batch):
    # Five-symbol vocabulary: instructions w/r/i and register bits 0/1.
    mapping = {'w': 0, 'r': 1, 'i': 2, '0': 3, '1': 4}
    tokenized_batch = [[mapping[char] for char in s] for s in batch['text']]
    return {'tokens': torch.tensor(tokenized_batch, dtype=torch.int64)}

# Tokenize lazily: the transform runs on each batch as it is accessed.
dataset.set_transform(tokenize_batch)
dataset['train'][0] # {'tokens': tensor([0, 4, 2, 4, 0, 3, 2, 3, 2 ...
```
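Ground-truth execution of a sequence is easy to express, which is what makes the residual Transformer errors notable. Here is a minimal checker sketch (`execute_flipflop` is our own helper, not part of the dataset or paper) that verifies every read echoes the last written bit:

```python
def execute_flipflop(seq):
    # seq alternates instruction chars ('w', 'r', 'i') with bit chars ('0', '1').
    register = None
    for i in range(0, len(seq), 2):
        op, bit = seq[i], seq[i + 1]
        if op == 'w':
            register = bit        # write: store the bit
        elif op == 'r' and bit != register:
            return False          # read must echo the last written bit
        # 'i' (ignore): the bit is a distractor; state is unchanged
    return True

assert execute_flipflop('w1i0r1w0i1r0')
assert not execute_flipflop('w1i0r0')
```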
Citation
---
```
@article{liu2023exposing,
title={Exposing Attention Glitches with Flip-Flop Language Modeling},
author={Liu, Bingbin and Ash, Jordan T and Goel, Surbhi and Krishnamurthy, Akshay and Zhang, Cyril},
journal={arXiv preprint arXiv:2306.00946},
year={2023}
}
``` |
sproos/cosmopedia-100k-v0-activations | ---
dataset_info:
features:
- name: text
dtype: string
- name: embedding
sequence: float64
- name: activations
sequence: float64
splits:
- name: train
num_bytes: 72548823.52993
num_examples: 2993
download_size: 16765503
dataset_size: 72548823.52993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|