| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
open-llm-leaderboard/details_migtissera__Synthia-7B | 2023-08-27T12:40:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-7B](https://huggingface.co/migtissera/Synthia-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T17:21:07.158534](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B/blob/main/results_2023-08-17T17%3A21%3A07.158534.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5056926539104564,\n\
\ \"acc_stderr\": 0.03526203275119529,\n \"acc_norm\": 0.5092682795407486,\n\
\ \"acc_norm_stderr\": 0.03524732365869811,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.4503268940046918,\n\
\ \"mc2_stderr\": 0.015174573803698735\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5366894197952219,\n \"acc_stderr\": 0.014572000527756994,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5997809201354312,\n\
\ \"acc_stderr\": 0.004889413126208774,\n \"acc_norm\": 0.7859988050189205,\n\
\ \"acc_norm_stderr\": 0.004092894578418982\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.030723535249006107,\n\
\ \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.030723535249006107\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264716,\n \"\
acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.034648816750163396,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.034648816750163396\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000752,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000752\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.673394495412844,\n \"acc_stderr\": 0.0201069908899373,\n \"acc_norm\"\
: 0.673394495412844,\n \"acc_norm_stderr\": 0.0201069908899373\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449296,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449296\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.698595146871009,\n\
\ \"acc_stderr\": 0.016409091097268784,\n \"acc_norm\": 0.698595146871009,\n\
\ \"acc_norm_stderr\": 0.016409091097268784\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952236,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952236\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468636,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468636\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617157,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401266,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806185,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806185\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3559322033898305,\n\
\ \"acc_stderr\": 0.01222864553727757,\n \"acc_norm\": 0.3559322033898305,\n\
\ \"acc_norm_stderr\": 0.01222864553727757\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5163398692810458,\n \"acc_stderr\": 0.02021703065318646,\n \"\
acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02021703065318646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.0344578996436275,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.0344578996436275\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.4503268940046918,\n\
\ \"mc2_stderr\": 0.015174573803698735\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|arc:challenge|25_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hellaswag|10_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:21:07.158534.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T17:21:07.158534.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T17:21:07.158534.parquet'
- config_name: results
data_files:
- split: 2023_08_17T17_21_07.158534
path:
- results_2023-08-17T17:21:07.158534.parquet
- split: latest
path:
- results_2023-08-17T17:21:07.158534.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-7B](https://huggingface.co/migtissera/Synthia-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-7B",
"harness_truthfulqa_mc_0",
split="train")
```
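Each task configuration also exposes a timestamped split and a `latest` split, and the aggregated scores live in the separate `results` configuration. The snippet below is a minimal sketch (not part of the official leaderboard tooling) of how you might explore them with the `datasets` library; the configuration names are taken from the YAML header of this card.
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_migtissera__Synthia-7B"

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs))

# Load the most recent evaluation details for a single task.
hellaswag = load_dataset(repo, "harness_hellaswag_10", split="latest")

# Load the aggregated scores behind the leaderboard entry.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```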
## Latest results
These are the [latest results from run 2023-08-17T17:21:07.158534](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B/blob/main/results_2023-08-17T17%3A21%3A07.158534.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5056926539104564,
"acc_stderr": 0.03526203275119529,
"acc_norm": 0.5092682795407486,
"acc_norm_stderr": 0.03524732365869811,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.4503268940046918,
"mc2_stderr": 0.015174573803698735
},
"harness|arc:challenge|25": {
"acc": 0.5366894197952219,
"acc_stderr": 0.014572000527756994,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.5997809201354312,
"acc_stderr": 0.004889413126208774,
"acc_norm": 0.7859988050189205,
"acc_norm_stderr": 0.004092894578418982
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.030723535249006107,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.030723535249006107
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.034648816750163396,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.034648816750163396
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000752,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000752
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.673394495412844,
"acc_stderr": 0.0201069908899373,
"acc_norm": 0.673394495412844,
"acc_norm_stderr": 0.0201069908899373
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.032834720561085606,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.032834720561085606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449296,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449296
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.698595146871009,
"acc_stderr": 0.016409091097268784,
"acc_norm": 0.698595146871009,
"acc_norm_stderr": 0.016409091097268784
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952236,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952236
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468636,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468636
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617157,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401266,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.028999080904806185,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.028999080904806185
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3559322033898305,
"acc_stderr": 0.01222864553727757,
"acc_norm": 0.3559322033898305,
"acc_norm_stderr": 0.01222864553727757
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02021703065318646,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02021703065318646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.0344578996436275,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.0344578996436275
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.4503268940046918,
"mc2_stderr": 0.015174573803698735
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_migtissera__Synthia-13B | 2023-08-27T12:40:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-13B](https://huggingface.co/migtissera/Synthia-13B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T07:48:14.366837](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-13B/blob/main/results_2023-08-18T07%3A48%3A14.366837.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.562038864233033,\n\
\ \"acc_stderr\": 0.034464491996069525,\n \"acc_norm\": 0.5661067245543279,\n\
\ \"acc_norm_stderr\": 0.03444423230659774,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.4741242810932586,\n\
\ \"mc2_stderr\": 0.015240307440730938\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344081,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809181\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6237801234813782,\n\
\ \"acc_stderr\": 0.004834461997944859,\n \"acc_norm\": 0.8185620394343757,\n\
\ \"acc_norm_stderr\": 0.003845930169643794\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.03452453903822039,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.03452453903822039\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.018175110510343574,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.018175110510343574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652247,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652247\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.02618966696627204,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.02618966696627204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4759776536312849,\n\
\ \"acc_stderr\": 0.016703190189300186,\n \"acc_norm\": 0.4759776536312849,\n\
\ \"acc_norm_stderr\": 0.016703190189300186\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825929,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825929\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994099,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994099\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4002607561929596,\n\
\ \"acc_stderr\": 0.012513582529136213,\n \"acc_norm\": 0.4002607561929596,\n\
\ \"acc_norm_stderr\": 0.012513582529136213\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468307,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.01997742260022747,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.01997742260022747\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.4741242810932586,\n\
\ \"mc2_stderr\": 0.015240307440730938\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:48:14.366837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:48:14.366837.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:48:14.366837.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:48:14.366837.parquet'
- config_name: results
data_files:
- split: 2023_08_18T07_48_14.366837
path:
- results_2023-08-18T07:48:14.366837.parquet
- split: latest
path:
- results_2023-08-18T07:48:14.366837.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-13B](https://huggingface.co/migtissera/Synthia-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-13B",
"harness_truthfulqa_mc_0",
split="train")
```
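Beyond the per-example details, the aggregated scores of a run are exposed through the "results" configuration declared in this card's config list. Below is a minimal sketch of loading them; the `latest` split name comes from that config list, while the exact row layout of the underlying parquet file is an assumption here:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
# Both the "results" configuration and the "latest" split are declared
# in the configs section of this card.
results = load_dataset(
    "open-llm-leaderboard/details_migtissera__Synthia-13B",
    "results",
    split="latest",
)

# Each row is expected to hold the aggregated metrics of one run.
print(results[0])
```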
## Latest results
These are the [latest results from run 2023-08-18T07:48:14.366837](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-13B/blob/main/results_2023-08-18T07%3A48%3A14.366837.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.562038864233033,
"acc_stderr": 0.034464491996069525,
"acc_norm": 0.5661067245543279,
"acc_norm_stderr": 0.03444423230659774,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.4741242810932586,
"mc2_stderr": 0.015240307440730938
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344081,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809181
},
"harness|hellaswag|10": {
"acc": 0.6237801234813782,
"acc_stderr": 0.004834461997944859,
"acc_norm": 0.8185620394343757,
"acc_norm_stderr": 0.003845930169643794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.03452453903822039,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.03452453903822039
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836557,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836557
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.018175110510343574,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.018175110510343574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652247,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652247
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.02618966696627204,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.02618966696627204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4759776536312849,
"acc_stderr": 0.016703190189300186,
"acc_norm": 0.4759776536312849,
"acc_norm_stderr": 0.016703190189300186
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994099,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994099
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4002607561929596,
"acc_stderr": 0.012513582529136213,
"acc_norm": 0.4002607561929596,
"acc_norm_stderr": 0.012513582529136213
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468307,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.01997742260022747,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.01997742260022747
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.4741242810932586,
"mc2_stderr": 0.015240307440730938
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_augtoma__qCammel-70x | 2023-09-24T00:38:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of augtoma/qCammel-70x
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [augtoma/qCammel-70x](https://huggingface.co/augtoma/qCammel-70x) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augtoma__qCammel-70x\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-24T00:38:03.634221](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70x/blob/main/results_2023-09-24T00-38-03.634221.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.033766778523489936,\n\
\ \"em_stderr\": 0.001849802869119515,\n \"f1\": 0.10340918624161041,\n\
\ \"f1_stderr\": 0.0022106009828094797,\n \"acc\": 0.5700654570173166,\n\
\ \"acc_stderr\": 0.011407494958111332\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.033766778523489936,\n \"em_stderr\": 0.001849802869119515,\n\
\ \"f1\": 0.10340918624161041,\n \"f1_stderr\": 0.0022106009828094797\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2971948445792267,\n \
\ \"acc_stderr\": 0.012588685966624186\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598479\n\
\ }\n}\n```"
repo_url: https://huggingface.co/augtoma/qCammel-70x
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|arc:challenge|25_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_24T00_38_03.634221
path:
- '**/details_harness|drop|3_2023-09-24T00-38-03.634221.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-24T00-38-03.634221.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_24T00_38_03.634221
path:
- '**/details_harness|gsm8k|5_2023-09-24T00-38-03.634221.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-24T00-38-03.634221.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hellaswag|10_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T05:27:12.496393.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T05:27:12.496393.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T05:27:12.496393.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_24T00_38_03.634221
path:
- '**/details_harness|winogrande|5_2023-09-24T00-38-03.634221.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-24T00-38-03.634221.parquet'
- config_name: results
data_files:
- split: 2023_08_18T05_27_12.496393
path:
- results_2023-08-18T05:27:12.496393.parquet
- split: 2023_09_24T00_38_03.634221
path:
- results_2023-09-24T00-38-03.634221.parquet
- split: latest
path:
- results_2023-09-24T00-38-03.634221.parquet
---
# Dataset Card for Evaluation run of augtoma/qCammel-70x
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/augtoma/qCammel-70x
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [augtoma/qCammel-70x](https://huggingface.co/augtoma/qCammel-70x) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_augtoma__qCammel-70x",
"harness_winogrande_5",
split="train")
```
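Each configuration also exposes the timestamped splits listed in the YAML header above (for example `2023_09_24T00_38_03.634221`), plus a `latest` alias for the most recent run. Below is a minimal sketch, assuming the standard `datasets` API, of loading the latest GSM8K details (the `harness_gsm8k_5` config defined above); the exact columns depend on what the harness stored for each example:
```python
from datasets import load_dataset

# Each config has one split per evaluation run (named after its timestamp,
# e.g. "2023_09_24T00_38_03.634221") plus a "latest" alias for the newest run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_augtoma__qCammel-70x",
    "harness_gsm8k_5",
    split="latest",
)

print(gsm8k_details)     # number of rows and available columns
print(gsm8k_details[0])  # first evaluated example for this task
```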
## Latest results
These are the [latest results from run 2023-09-24T00:38:03.634221](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70x/blob/main/results_2023-09-24T00-38-03.634221.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.033766778523489936,
"em_stderr": 0.001849802869119515,
"f1": 0.10340918624161041,
"f1_stderr": 0.0022106009828094797,
"acc": 0.5700654570173166,
"acc_stderr": 0.011407494958111332
},
"harness|drop|3": {
"em": 0.033766778523489936,
"em_stderr": 0.001849802869119515,
"f1": 0.10340918624161041,
"f1_stderr": 0.0022106009828094797
},
"harness|gsm8k|5": {
"acc": 0.2971948445792267,
"acc_stderr": 0.012588685966624186
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598479
}
}
```
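To work with these aggregated numbers programmatically rather than copying them from the JSON above, one option is to load the `results` configuration defined in the YAML header. This is a sketch under the same `datasets` API assumption as above; the exact fields depend on what the run stored:
```python
from datasets import load_dataset

# The "results" config holds the aggregated per-run metrics (here: drop, gsm8k,
# winogrande); its "latest" split corresponds to the 2023-09-24 run shown above.
results = load_dataset(
    "open-llm-leaderboard/details_augtoma__qCammel-70x",
    "results",
    split="latest",
)

print(results.column_names)  # inspect which metric fields are available
print(results[0])            # aggregated metrics for the latest run
```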
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_augtoma__qCammel70 | 2023-08-27T12:40:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of augtoma/qCammel70
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [augtoma/qCammel70](https://huggingface.co/augtoma/qCammel70) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augtoma__qCammel70\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T06:33:28.828480](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel70/blob/main/results_2023-08-18T06%3A33%3A28.828480.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7005183426435676,\n\
\ \"acc_stderr\": 0.030906375362302115,\n \"acc_norm\": 0.7044769925331555,\n\
\ \"acc_norm_stderr\": 0.030875964993930118,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.01716027390169365,\n \"mc2\": 0.5746873588951067,\n\
\ \"mc2_stderr\": 0.0145465597784753\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839159,\n\
\ \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068079\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6878111929894444,\n\
\ \"acc_stderr\": 0.004624393690966901,\n \"acc_norm\": 0.8787094204341764,\n\
\ \"acc_norm_stderr\": 0.0032579745937899455\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02967416752010147,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02967416752010147\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745643,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745643\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777028,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777028\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528437,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528437\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822523,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8972477064220183,\n \"acc_stderr\": 0.013018246509173768,\n \"\
acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.013018246509173768\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758545,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758545\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951539,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951539\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.032484700838071943,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.032484700838071943\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.02876748172598386,\n\
\ \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.02876748172598386\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941635,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941635\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.0122343845868565,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.0122343845868565\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.021393961404363847,\n\
\ \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.021393961404363847\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5608938547486033,\n\
\ \"acc_stderr\": 0.016598022120580435,\n \"acc_norm\": 0.5608938547486033,\n\
\ \"acc_norm_stderr\": 0.016598022120580435\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398215,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398215\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.02042395535477803,\n\
\ \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.02042395535477803\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5602836879432624,\n \"acc_stderr\": 0.029609912075594113,\n \
\ \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.029609912075594113\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559322033898305,\n\
\ \"acc_stderr\": 0.012680037994097042,\n \"acc_norm\": 0.559322033898305,\n\
\ \"acc_norm_stderr\": 0.012680037994097042\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.026040662474201257,\n\
\ \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.026040662474201257\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.01740181671142765,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.01740181671142765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.0259911176728133,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.0259911176728133\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02410338420207286,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02410338420207286\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.01716027390169365,\n \"mc2\": 0.5746873588951067,\n\
\ \"mc2_stderr\": 0.0145465597784753\n }\n}\n```"
repo_url: https://huggingface.co/augtoma/qCammel70
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|arc:challenge|25_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hellaswag|10_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:33:28.828480.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:33:28.828480.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T06:33:28.828480.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T06:33:28.828480.parquet'
- config_name: results
data_files:
- split: 2023_08_18T06_33_28.828480
path:
- results_2023-08-18T06:33:28.828480.parquet
- split: latest
path:
- results_2023-08-18T06:33:28.828480.parquet
---
# Dataset Card for Evaluation run of augtoma/qCammel70
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/augtoma/qCammel70
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [augtoma/qCammel70](https://huggingface.co/augtoma/qCammel70) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_augtoma__qCammel70",
"harness_truthfulqa_mc_0",
split="train")
```
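You can also pull a single sub-task at its most recent state by passing one of the config names listed in this card's header together with the `latest` split. A minimal sketch (the config name below is just one of the 61 available; any other works the same way):
```python
from datasets import load_dataset

# Minimal sketch: load the most recent results for one MMLU sub-task.
# "harness_hendrycksTest_world_religions_5" is one of the config names
# listed in this card's header.
world_religions = load_dataset(
    "open-llm-leaderboard/details_augtoma__qCammel70",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(world_religions)
```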
## Latest results
These are the [latest results from run 2023-08-18T06:33:28.828480](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel70/blob/main/results_2023-08-18T06%3A33%3A28.828480.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7005183426435676,
"acc_stderr": 0.030906375362302115,
"acc_norm": 0.7044769925331555,
"acc_norm_stderr": 0.030875964993930118,
"mc1": 0.401468788249694,
"mc1_stderr": 0.01716027390169365,
"mc2": 0.5746873588951067,
"mc2_stderr": 0.0145465597784753
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839159,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.013592431519068079
},
"harness|hellaswag|10": {
"acc": 0.6878111929894444,
"acc_stderr": 0.004624393690966901,
"acc_norm": 0.8787094204341764,
"acc_norm_stderr": 0.0032579745937899455
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02967416752010147,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02967416752010147
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745643,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745643
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777028,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777028
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528437,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528437
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822523,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.013018246509173768,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.013018246509173768
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758545,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758545
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383602,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383602
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951539,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951539
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.032484700838071943,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.032484700838071943
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8404907975460123,
"acc_stderr": 0.02876748172598386,
"acc_norm": 0.8404907975460123,
"acc_norm_stderr": 0.02876748172598386
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941635,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941635
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.0122343845868565,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.0122343845868565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.021393961404363847,
"acc_norm": 0.8034682080924855,
"acc_norm_stderr": 0.021393961404363847
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5608938547486033,
"acc_stderr": 0.016598022120580435,
"acc_norm": 0.5608938547486033,
"acc_norm_stderr": 0.016598022120580435
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398215,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398215
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.02042395535477803,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.02042395535477803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.029609912075594113,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.029609912075594113
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.559322033898305,
"acc_stderr": 0.012680037994097042,
"acc_norm": 0.559322033898305,
"acc_norm_stderr": 0.012680037994097042
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.026040662474201257,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.026040662474201257
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.0259911176728133,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.0259911176728133
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02410338420207286,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02410338420207286
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.01716027390169365,
"mc2": 0.5746873588951067,
"mc2_stderr": 0.0145465597784753
}
}
```
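The per-task entries above can be aggregated into a single MMLU figure. A minimal sketch, assuming the JSON shown above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Minimal sketch: average acc_norm over the MMLU (hendrycksTest) sub-tasks
# from the per-task entries in the JSON above, assumed saved as results.json.
with open("results.json") as f:
    results = json.load(f)

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
avg_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"Average acc_norm over {len(mmlu)} MMLU sub-tasks: {avg_acc_norm:.4f}")
```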
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16 | 2023-08-27T12:40:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-openllama-3b-v10-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-openllama-3b-v10-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T14:16:36.275338](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16/blob/main/results_2023-08-17T14%3A16%3A36.275338.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2441189334942959,\n\
\ \"acc_stderr\": 0.03111544871322134,\n \"acc_norm\": 0.24680047198134006,\n\
\ \"acc_norm_stderr\": 0.031119047644332574,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4204215849787592,\n\
\ \"mc2_stderr\": 0.014857530477655869\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.33447098976109213,\n \"acc_stderr\": 0.01378746032244138,\n\
\ \"acc_norm\": 0.3626279863481229,\n \"acc_norm_stderr\": 0.014049106564955009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45379406492730534,\n\
\ \"acc_stderr\": 0.004968429476345024,\n \"acc_norm\": 0.5838478390758813,\n\
\ \"acc_norm_stderr\": 0.00491912016939434\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111836,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111836\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.031862098516411454,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.031862098516411454\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262883,\n \"\
acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262883\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1935483870967742,\n \"acc_stderr\": 0.02247525852553606,\n \"\
acc_norm\": 0.1935483870967742,\n \"acc_norm_stderr\": 0.02247525852553606\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.1625615763546798,\n \"acc_stderr\": 0.025960300064605573,\n \"\
acc_norm\": 0.1625615763546798,\n \"acc_norm_stderr\": 0.025960300064605573\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.24242424242424243,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117447,\n\
\ \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026938,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026938\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.034791855725996586,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.034791855725996586\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217888,\n \"\
acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217888\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.02541642838876748,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.02541642838876748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395592,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891145,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891145\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044276,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044276\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.01455155365936992,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.01455155365936992\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.01091640673547895,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.01091640673547895\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n\
\ \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.21224489795918366,\n\
\ \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.21224489795918366,\n\
\ \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n\
\ \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n\
\ \"mc2\": 0.4204215849787592,\n \"mc2_stderr\": 0.014857530477655869\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:16:36.275338.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:16:36.275338.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:16:36.275338.parquet'
- config_name: results
data_files:
- split: 2023_08_17T14_16_36.275338
path:
- results_2023-08-17T14:16:36.275338.parquet
- split: latest
path:
- results_2023-08-17T14:16:36.275338.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-openllama-3b-v10-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-openllama-3b-v10-bf16](https://huggingface.co/OpenBuddy/openbuddy-openllama-3b-v10-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16",
"harness_truthfulqa_mc_0",
    split="latest")
```
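The same call works for any configuration listed in the metadata above. A minimal sketch (assuming the "latest" split defined in the `configs:` section of this card) loading one MMLU subtask and the aggregated "results" configuration:
```python
from datasets import load_dataset

# Per-task details for a single MMLU subtask; the configuration name is taken
# from the `configs:` section of this card.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16",
    "results",
    split="latest",
)
```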
## Latest results
These are the [latest results from run 2023-08-17T14:16:36.275338](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-openllama-3b-v10-bf16/blob/main/results_2023-08-17T14%3A16%3A36.275338.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2441189334942959,
"acc_stderr": 0.03111544871322134,
"acc_norm": 0.24680047198134006,
"acc_norm_stderr": 0.031119047644332574,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4204215849787592,
"mc2_stderr": 0.014857530477655869
},
"harness|arc:challenge|25": {
"acc": 0.33447098976109213,
"acc_stderr": 0.01378746032244138,
"acc_norm": 0.3626279863481229,
"acc_norm_stderr": 0.014049106564955009
},
"harness|hellaswag|10": {
"acc": 0.45379406492730534,
"acc_stderr": 0.004968429476345024,
"acc_norm": 0.5838478390758813,
"acc_norm_stderr": 0.00491912016939434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111836,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111836
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.031862098516411454,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.031862098516411454
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342347,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342347
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262883,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262883
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1935483870967742,
"acc_stderr": 0.02247525852553606,
"acc_norm": 0.1935483870967742,
"acc_norm_stderr": 0.02247525852553606
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1625615763546798,
"acc_stderr": 0.025960300064605573,
"acc_norm": 0.1625615763546798,
"acc_norm_stderr": 0.025960300064605573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117447,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026938,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026938
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996586,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996586
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1981651376146789,
"acc_stderr": 0.017090573804217888,
"acc_norm": 0.1981651376146789,
"acc_norm_stderr": 0.017090573804217888
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.02541642838876748,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.02541642838876748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891145,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891145
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044276,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044276
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.01455155365936992,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.01455155365936992
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480768,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.01091640673547895,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.01091640673547895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4204215849787592,
"mc2_stderr": 0.014857530477655869
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16 | 2023-08-27T12:40:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-atom-13b-v9-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-atom-13b-v9-bf16](https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:31:32.257089](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16/blob/main/results_2023-08-17T18%3A31%3A32.257089.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49452023642023396,\n\
\ \"acc_stderr\": 0.03491073195668355,\n \"acc_norm\": 0.4981238638390924,\n\
\ \"acc_norm_stderr\": 0.0348991272574129,\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.4865705748891018,\n\
\ \"mc2_stderr\": 0.01500493816042778\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.492320819112628,\n \"acc_stderr\": 0.014609667440892577,\n\
\ \"acc_norm\": 0.5119453924914675,\n \"acc_norm_stderr\": 0.014607220340597167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5669189404501095,\n\
\ \"acc_stderr\": 0.004944889545497956,\n \"acc_norm\": 0.7599083847839075,\n\
\ \"acc_norm_stderr\": 0.004262659388824526\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851316,\n\
\ \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851316\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278006,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278006\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5707070707070707,\n \"acc_stderr\": 0.03526552724601199,\n \"\
acc_norm\": 0.5707070707070707,\n \"acc_norm_stderr\": 0.03526552724601199\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041154,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097866,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097866\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6422018348623854,\n \"acc_stderr\": 0.02055206078482783,\n \"\
acc_norm\": 0.6422018348623854,\n \"acc_norm_stderr\": 0.02055206078482783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298804,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298804\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591519,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591519\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185812,\n \
\ \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185812\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978814,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978814\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598642,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149123,\n\
\ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149123\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n\
\ \"acc_stderr\": 0.014125968754673403,\n \"acc_norm\": 0.2324022346368715,\n\
\ \"acc_norm_stderr\": 0.014125968754673403\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5337620578778135,\n\
\ \"acc_stderr\": 0.028333277109562793,\n \"acc_norm\": 0.5337620578778135,\n\
\ \"acc_norm_stderr\": 0.028333277109562793\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871584,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871584\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3833116036505867,\n\
\ \"acc_stderr\": 0.01241760366290119,\n \"acc_norm\": 0.3833116036505867,\n\
\ \"acc_norm_stderr\": 0.01241760366290119\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.02997280717046463,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.02997280717046463\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5032679738562091,\n \"acc_stderr\": 0.020227402794434867,\n \
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.020227402794434867\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.0163391703732809,\n \"mc2\": 0.4865705748891018,\n\
\ \"mc2_stderr\": 0.01500493816042778\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:31:32.257089.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:31:32.257089.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:31:32.257089.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_31_32.257089
path:
- results_2023-08-17T18:31:32.257089.parquet
- split: latest
path:
- results_2023-08-17T18:31:32.257089.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-atom-13b-v9-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-atom-13b-v9-bf16](https://huggingface.co/OpenBuddy/openbuddy-atom-13b-v9-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16",
"harness_truthfulqa_mc_0",
split="train")
```
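The split named "train" in the example above follows the card template; per the `configs` listed in the YAML metadata above, each task configuration also exposes a timestamped split and a "latest" split. As a minimal sketch (assuming those split names), the most recent ARC-Challenge details could be loaded like this:
```python
from datasets import load_dataset

# Hypothetical usage: the config and split names below are taken from the
# `configs` section of this card ("latest" always points to the newest run).
arc_details = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16",
    "harness_arc_challenge_25",
    split="latest",
)
print(arc_details)
```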
## Latest results
These are the [latest results from run 2023-08-17T18:31:32.257089](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16/blob/main/results_2023-08-17T18%3A31%3A32.257089.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49452023642023396,
"acc_stderr": 0.03491073195668355,
"acc_norm": 0.4981238638390924,
"acc_norm_stderr": 0.0348991272574129,
"mc1": 0.32068543451652387,
"mc1_stderr": 0.0163391703732809,
"mc2": 0.4865705748891018,
"mc2_stderr": 0.01500493816042778
},
"harness|arc:challenge|25": {
"acc": 0.492320819112628,
"acc_stderr": 0.014609667440892577,
"acc_norm": 0.5119453924914675,
"acc_norm_stderr": 0.014607220340597167
},
"harness|hellaswag|10": {
"acc": 0.5669189404501095,
"acc_stderr": 0.004944889545497956,
"acc_norm": 0.7599083847839075,
"acc_norm_stderr": 0.004262659388824526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.030770900763851316,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.030770900763851316
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278006,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278006
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5707070707070707,
"acc_stderr": 0.03526552724601199,
"acc_norm": 0.5707070707070707,
"acc_norm_stderr": 0.03526552724601199
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097866,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097866
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6422018348623854,
"acc_stderr": 0.02055206078482783,
"acc_norm": 0.6422018348623854,
"acc_norm_stderr": 0.02055206078482783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298804,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298804
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591519,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591519
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6455696202531646,
"acc_stderr": 0.031137304297185812,
"acc_norm": 0.6455696202531646,
"acc_norm_stderr": 0.031137304297185812
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978814,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978814
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598642,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.026917296179149123,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.026917296179149123
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.014125968754673403,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.014125968754673403
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5337620578778135,
"acc_stderr": 0.028333277109562793,
"acc_norm": 0.5337620578778135,
"acc_norm_stderr": 0.028333277109562793
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871584,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871584
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3833116036505867,
"acc_stderr": 0.01241760366290119,
"acc_norm": 0.3833116036505867,
"acc_norm_stderr": 0.01241760366290119
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.02997280717046463,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.02997280717046463
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.020227402794434867,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.020227402794434867
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32068543451652387,
"mc1_stderr": 0.0163391703732809,
"mc2": 0.4865705748891018,
"mc2_stderr": 0.01500493816042778
}
}
```
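The aggregated numbers above are also stored in the "results" configuration declared in the YAML metadata. A minimal sketch (assuming the "results" config and its "latest" split) for loading them:
```python
from datasets import load_dataset

# Load the aggregated results table and inspect its columns.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-atom-13b-v9-bf16",
    "results",
    split="latest",
)
print(results.column_names)
```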
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_HWERI__pythia-1.4b-deduped-sharegpt | 2023-09-16T20:15:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of HWERI/pythia-1.4b-deduped-sharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HWERI/pythia-1.4b-deduped-sharegpt](https://huggingface.co/HWERI/pythia-1.4b-deduped-sharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HWERI__pythia-1.4b-deduped-sharegpt\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T20:15:27.580598](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-1.4b-deduped-sharegpt/blob/main/results_2023-09-16T20-15-27.580598.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652192217,\n \"f1\": 0.04875104865771823,\n\
\ \"f1_stderr\": 0.0012458540332815637,\n \"acc\": 0.2804129195481258,\n\
\ \"acc_stderr\": 0.008239894933698364\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192217,\n\
\ \"f1\": 0.04875104865771823,\n \"f1_stderr\": 0.0012458540332815637\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \
\ \"acc_stderr\": 0.002504942226860534\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5524861878453039,\n \"acc_stderr\": 0.013974847640536194\n\
\ }\n}\n```"
repo_url: https://huggingface.co/HWERI/pythia-1.4b-deduped-sharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T20_15_27.580598
path:
- '**/details_harness|drop|3_2023-09-16T20-15-27.580598.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T20-15-27.580598.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T20_15_27.580598
path:
- '**/details_harness|gsm8k|5_2023-09-16T20-15-27.580598.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T20-15-27.580598.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:24:42.073512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:24:42.073512.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:24:42.073512.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T20_15_27.580598
path:
- '**/details_harness|winogrande|5_2023-09-16T20-15-27.580598.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T20-15-27.580598.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_24_42.073512
path:
- results_2023-08-17T18:24:42.073512.parquet
- split: 2023_09_16T20_15_27.580598
path:
- results_2023-09-16T20-15-27.580598.parquet
- split: latest
path:
- results_2023-09-16T20-15-27.580598.parquet
---
# Dataset Card for Evaluation run of HWERI/pythia-1.4b-deduped-sharegpt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HWERI/pythia-1.4b-deduped-sharegpt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HWERI/pythia-1.4b-deduped-sharegpt](https://huggingface.co/HWERI/pythia-1.4b-deduped-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HWERI__pythia-1.4b-deduped-sharegpt",
"harness_winogrande_5",
split="train")
```
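Each per-task configuration also declares an explicit `latest` split (alongside the timestamped splits listed in the YAML above), so the most recent run for a single task can be selected directly. A minimal sketch, assuming those split names:
```python
from datasets import load_dataset

# Load the most recent run of a single task via its "latest" split
# (split names taken from the YAML configs declared above).
data = load_dataset("open-llm-leaderboard/details_HWERI__pythia-1.4b-deduped-sharegpt",
	"harness_winogrande_5",
	split="latest")
```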
## Latest results
These are the [latest results from run 2023-09-16T20:15:27.580598](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-1.4b-deduped-sharegpt/blob/main/results_2023-09-16T20-15-27.580598.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192217,
"f1": 0.04875104865771823,
"f1_stderr": 0.0012458540332815637,
"acc": 0.2804129195481258,
"acc_stderr": 0.008239894933698364
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192217,
"f1": 0.04875104865771823,
"f1_stderr": 0.0012458540332815637
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860534
},
"harness|winogrande|5": {
"acc": 0.5524861878453039,
"acc_stderr": 0.013974847640536194
}
}
```
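The same aggregated metrics are also exposed through the `results` configuration declared in the YAML above; a minimal sketch of loading them programmatically, assuming the `results` config and `latest` split names shown there:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run, via the "results" config.
results = load_dataset("open-llm-leaderboard/details_HWERI__pythia-1.4b-deduped-sharegpt",
	"results",
	split="latest")
print(results[0])  # first row of the aggregated results table
```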
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_grimpep__llama2-22B-GPLATTY | 2023-08-27T12:40:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of grimpep/llama2-22B-GPLATTY
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [grimpep/llama2-22B-GPLATTY](https://huggingface.co/grimpep/llama2-22B-GPLATTY)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grimpep__llama2-22B-GPLATTY\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T14:31:52.312230](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__llama2-22B-GPLATTY/blob/main/results_2023-08-17T14%3A31%3A52.312230.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5471453447112705,\n\
\ \"acc_stderr\": 0.034452241209601206,\n \"acc_norm\": 0.550874294679223,\n\
\ \"acc_norm_stderr\": 0.03443332656790291,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.4692973392633332,\n\
\ \"mc2_stderr\": 0.0156700439246235\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.560580204778157,\n \"acc_stderr\": 0.014503747823580123,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6290579565823541,\n\
\ \"acc_stderr\": 0.004820697457420421,\n \"acc_norm\": 0.8200557657837084,\n\
\ \"acc_norm_stderr\": 0.003833559228158675\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.03024223380085449,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.03024223380085449\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845697,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845697\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.01906909836319144,\n \"acc_norm\"\
: 0.728440366972477,\n \"acc_norm_stderr\": 0.01906909836319144\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.41203703703703703,\n\
\ \"acc_stderr\": 0.03356787758160835,\n \"acc_norm\": 0.41203703703703703,\n\
\ \"acc_norm_stderr\": 0.03356787758160835\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n\
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734195,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734195\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.016018239710513405,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.016018239710513405\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971646,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971646\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.0268228017595079,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.0268228017595079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612493,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612493\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.012510181636960672,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.012510181636960672\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \"\
acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387634,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387634\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.4692973392633332,\n\
\ \"mc2_stderr\": 0.0156700439246235\n }\n}\n```"
repo_url: https://huggingface.co/grimpep/llama2-22B-GPLATTY
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:31:52.312230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:31:52.312230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:31:52.312230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:31:52.312230.parquet'
- config_name: results
data_files:
- split: 2023_08_17T14_31_52.312230
path:
- results_2023-08-17T14:31:52.312230.parquet
- split: latest
path:
- results_2023-08-17T14:31:52.312230.parquet
---
# Dataset Card for Evaluation run of grimpep/llama2-22B-GPLATTY
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/grimpep/llama2-22B-GPLATTY
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [grimpep/llama2-22B-GPLATTY](https://huggingface.co/grimpep/llama2-22B-GPLATTY) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grimpep__llama2-22B-GPLATTY",
"harness_truthfulqa_mc_0",
split="train")
```
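If you want to see which configurations are available before picking one, the `datasets` library can enumerate them. This is a minimal sketch; the names mirror the `config_name` entries in the YAML header of this card:
```python
from datasets import get_dataset_config_names

# List the per-task configurations plus the aggregated "results" config
configs = get_dataset_config_names("open-llm-leaderboard/details_grimpep__llama2-22B-GPLATTY")
print(len(configs), configs[:5])
```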
## Latest results
These are the [latest results from run 2023-08-17T14:31:52.312230](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__llama2-22B-GPLATTY/blob/main/results_2023-08-17T14%3A31%3A52.312230.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5471453447112705,
"acc_stderr": 0.034452241209601206,
"acc_norm": 0.550874294679223,
"acc_norm_stderr": 0.03443332656790291,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.4692973392633332,
"mc2_stderr": 0.0156700439246235
},
"harness|arc:challenge|25": {
"acc": 0.560580204778157,
"acc_stderr": 0.014503747823580123,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.6290579565823541,
"acc_stderr": 0.004820697457420421,
"acc_norm": 0.8200557657837084,
"acc_norm_stderr": 0.003833559228158675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.03024223380085449,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.03024223380085449
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845697,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845697
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.01906909836319144,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.01906909836319144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734195,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734195
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513405,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513405
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971646,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.0268228017595079,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.0268228017595079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612493,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612493
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.012510181636960672,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.012510181636960672
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387634,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387634
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.4692973392633332,
"mc2_stderr": 0.0156700439246235
}
}
```
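To work with these aggregated numbers programmatically rather than reading the JSON above, you can load the `results` configuration and its `latest` split. This is a sketch assuming the same repository as in the loading example above:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of the run;
# the "latest" split always points to the newest evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_grimpep__llama2-22B-GPLATTY",
    "results",
    split="latest",
)
print(results[0])
```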
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_grimpep__llama2-22b-wizard_vicuna | 2023-08-27T12:40:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of grimpep/llama2-22b-wizard_vicuna
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [grimpep/llama2-22b-wizard_vicuna](https://huggingface.co/grimpep/llama2-22b-wizard_vicuna)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grimpep__llama2-22b-wizard_vicuna\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T14:12:20.144901](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__llama2-22b-wizard_vicuna/blob/main/results_2023-08-17T14%3A12%3A20.144901.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5471453447112705,\n\
\ \"acc_stderr\": 0.034452241209601206,\n \"acc_norm\": 0.550874294679223,\n\
\ \"acc_norm_stderr\": 0.03443332656790291,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.4692973392633332,\n\
\ \"mc2_stderr\": 0.0156700439246235\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.560580204778157,\n \"acc_stderr\": 0.014503747823580123,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6290579565823541,\n\
\ \"acc_stderr\": 0.004820697457420421,\n \"acc_norm\": 0.8200557657837084,\n\
\ \"acc_norm_stderr\": 0.003833559228158675\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.03024223380085449,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.03024223380085449\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845697,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845697\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.01906909836319144,\n \"acc_norm\"\
: 0.728440366972477,\n \"acc_norm_stderr\": 0.01906909836319144\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.41203703703703703,\n\
\ \"acc_stderr\": 0.03356787758160835,\n \"acc_norm\": 0.41203703703703703,\n\
\ \"acc_norm_stderr\": 0.03356787758160835\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n\
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734195,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734195\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.016018239710513405,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.016018239710513405\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971646,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971646\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.0268228017595079,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.0268228017595079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612493,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612493\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.012510181636960672,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.012510181636960672\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \"\
acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387634,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387634\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.4692973392633332,\n\
\ \"mc2_stderr\": 0.0156700439246235\n }\n}\n```"
repo_url: https://huggingface.co/grimpep/llama2-22b-wizard_vicuna
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:12:20.144901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:12:20.144901.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:12:20.144901.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:12:20.144901.parquet'
- config_name: results
data_files:
- split: 2023_08_17T14_12_20.144901
path:
- results_2023-08-17T14:12:20.144901.parquet
- split: latest
path:
- results_2023-08-17T14:12:20.144901.parquet
---
# Dataset Card for Evaluation run of grimpep/llama2-22b-wizard_vicuna
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/grimpep/llama2-22b-wizard_vicuna
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [grimpep/llama2-22b-wizard_vicuna](https://huggingface.co/grimpep/llama2-22b-wizard_vicuna) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grimpep__llama2-22b-wizard_vicuna",
"harness_truthfulqa_mc_0",
split="train")
```
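The same call works for any of the per-task configurations defined in the YAML header of this card. As a minimal sketch, the snippet below loads the `latest` split of one of the MMLU subtasks (the config and split names are taken from the `configs` list above):
```python
from datasets import load_dataset

# Load the "latest" split of a single MMLU subtask for this evaluation run.
# The config name and split name come from the `configs` section of the card.
details = load_dataset(
    "open-llm-leaderboard/details_grimpep__llama2-22b-wizard_vicuna",
    "harness_hendrycksTest_college_physics_5",
    split="latest",
)
print(details)
```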
## Latest results
These are the [latest results from run 2023-08-17T14:12:20.144901](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__llama2-22b-wizard_vicuna/blob/main/results_2023-08-17T14%3A12%3A20.144901.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5471453447112705,
"acc_stderr": 0.034452241209601206,
"acc_norm": 0.550874294679223,
"acc_norm_stderr": 0.03443332656790291,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.4692973392633332,
"mc2_stderr": 0.0156700439246235
},
"harness|arc:challenge|25": {
"acc": 0.560580204778157,
"acc_stderr": 0.014503747823580123,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.6290579565823541,
"acc_stderr": 0.004820697457420421,
"acc_norm": 0.8200557657837084,
"acc_norm_stderr": 0.003833559228158675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.03024223380085449,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.03024223380085449
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845697,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845697
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.01906909836319144,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.01906909836319144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734195,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734195
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.016018239710513405,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.016018239710513405
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971646,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.0268228017595079,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.0268228017595079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612493,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612493
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.012510181636960672,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.012510181636960672
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387634,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387634
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.4692973392633332,
"mc2_stderr": 0.0156700439246235
}
}
```
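These aggregated numbers can also be read back programmatically; a minimal sketch using the `results` configuration and the `latest` split defined in the YAML header:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# the "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_grimpep__llama2-22b-wizard_vicuna",
    "results",
    split="latest",
)
print(results)
```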
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Enno-Ai__ennodata-7b | 2023-08-27T12:40:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Enno-Ai/ennodata-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Enno-Ai/ennodata-7b](https://huggingface.co/Enno-Ai/ennodata-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Enno-Ai__ennodata-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:21:05.699051](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__ennodata-7b/blob/main/results_2023-08-17T18%3A21%3A05.699051.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3456994326510667,\n\
\ \"acc_stderr\": 0.03418929248026247,\n \"acc_norm\": 0.34974830348542335,\n\
\ \"acc_norm_stderr\": 0.03417639724082498,\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427686,\n \"mc2\": 0.3353289270087254,\n\
\ \"mc2_stderr\": 0.013074362091466094\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47440273037542663,\n \"acc_stderr\": 0.014592230885298964,\n\
\ \"acc_norm\": 0.5102389078498294,\n \"acc_norm_stderr\": 0.014608326906285012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5731925911173074,\n\
\ \"acc_stderr\": 0.004936029827672036,\n \"acc_norm\": 0.7762397928699463,\n\
\ \"acc_norm_stderr\": 0.0041591146798738285\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32452830188679244,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.32452830188679244,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3063583815028902,\n\
\ \"acc_stderr\": 0.035149425512674366,\n \"acc_norm\": 0.3063583815028902,\n\
\ \"acc_norm_stderr\": 0.035149425512674366\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746304,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746304\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111394,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3258064516129032,\n\
\ \"acc_stderr\": 0.026662010578567104,\n \"acc_norm\": 0.3258064516129032,\n\
\ \"acc_norm_stderr\": 0.026662010578567104\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070643,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070643\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.39378238341968913,\n \"acc_stderr\": 0.03526077095548237,\n\
\ \"acc_norm\": 0.39378238341968913,\n \"acc_norm_stderr\": 0.03526077095548237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886845,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886845\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.42935779816513764,\n\
\ \"acc_stderr\": 0.021222286397236504,\n \"acc_norm\": 0.42935779816513764,\n\
\ \"acc_norm_stderr\": 0.021222286397236504\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n\
\ \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3284313725490196,\n \"acc_stderr\": 0.03296245110172228,\n \"\
acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.03296245110172228\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4050632911392405,\n \"acc_stderr\": 0.031955147413706725,\n \
\ \"acc_norm\": 0.4050632911392405,\n \"acc_norm_stderr\": 0.031955147413706725\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.04142313771996664,\n\
\ \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.04142313771996664\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.512396694214876,\n \"acc_stderr\": 0.04562951548180765,\n \"acc_norm\"\
: 0.512396694214876,\n \"acc_norm_stderr\": 0.04562951548180765\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04712821257426771,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04712821257426771\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.040598672469526864,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.040598672469526864\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.452991452991453,\n\
\ \"acc_stderr\": 0.0326109987309862,\n \"acc_norm\": 0.452991452991453,\n\
\ \"acc_norm_stderr\": 0.0326109987309862\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40229885057471265,\n\
\ \"acc_stderr\": 0.017535294529068955,\n \"acc_norm\": 0.40229885057471265,\n\
\ \"acc_norm_stderr\": 0.017535294529068955\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.37283236994219654,\n \"acc_stderr\": 0.026033890613576277,\n\
\ \"acc_norm\": 0.37283236994219654,\n \"acc_norm_stderr\": 0.026033890613576277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n\
\ \"acc_stderr\": 0.02791705074848462,\n \"acc_norm\": 0.40836012861736337,\n\
\ \"acc_norm_stderr\": 0.02791705074848462\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.33641975308641975,\n \"acc_stderr\": 0.026289734945952926,\n\
\ \"acc_norm\": 0.33641975308641975,\n \"acc_norm_stderr\": 0.026289734945952926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590624,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590624\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29465449804432853,\n\
\ \"acc_stderr\": 0.011643576764069541,\n \"acc_norm\": 0.29465449804432853,\n\
\ \"acc_norm_stderr\": 0.011643576764069541\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.33986928104575165,\n \"acc_stderr\": 0.01916241858862356,\n \
\ \"acc_norm\": 0.33986928104575165,\n \"acc_norm_stderr\": 0.01916241858862356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\
\ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\
\ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.028666857790274645,\n\
\ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.028666857790274645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4228855721393035,\n\
\ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.4228855721393035,\n\
\ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.0381107966983353,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.0381107966983353\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427686,\n \"mc2\": 0.3353289270087254,\n\
\ \"mc2_stderr\": 0.013074362091466094\n }\n}\n```"
repo_url: https://huggingface.co/Enno-Ai/ennodata-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:21:05.699051.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:21:05.699051.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:21:05.699051.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:21:05.699051.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_21_05.699051
path:
- results_2023-08-17T18:21:05.699051.parquet
- split: latest
path:
- results_2023-08-17T18:21:05.699051.parquet
---
# Dataset Card for Evaluation run of Enno-Ai/ennodata-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Enno-Ai/ennodata-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Enno-Ai/ennodata-7b](https://huggingface.co/Enno-Ai/ennodata-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Enno-Ai__ennodata-7b",
"harness_truthfulqa_mc_0",
split="train")
```
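The split names declared in the `configs` section of this card can be passed directly as well; `latest` is an alias for the most recent timestamped run. A minimal sketch, assuming the standard `datasets` split resolution for the data files listed above:
```python
from datasets import load_dataset

# "latest" is a split alias defined in this card's data_files; it points at
# the newest timestamped parquet for the chosen task configuration.
details = load_dataset(
    "open-llm-leaderboard/details_Enno-Ai__ennodata-7b",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(details)  # row count and per-example columns for the latest run
```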
## Latest results
These are the [latest results from run 2023-08-17T18:21:05.699051](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__ennodata-7b/blob/main/results_2023-08-17T18%3A21%3A05.699051.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3456994326510667,
"acc_stderr": 0.03418929248026247,
"acc_norm": 0.34974830348542335,
"acc_norm_stderr": 0.03417639724082498,
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427686,
"mc2": 0.3353289270087254,
"mc2_stderr": 0.013074362091466094
},
"harness|arc:challenge|25": {
"acc": 0.47440273037542663,
"acc_stderr": 0.014592230885298964,
"acc_norm": 0.5102389078498294,
"acc_norm_stderr": 0.014608326906285012
},
"harness|hellaswag|10": {
"acc": 0.5731925911173074,
"acc_stderr": 0.004936029827672036,
"acc_norm": 0.7762397928699463,
"acc_norm_stderr": 0.0041591146798738285
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32452830188679244,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.32452830188679244,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3063583815028902,
"acc_stderr": 0.035149425512674366,
"acc_norm": 0.3063583815028902,
"acc_norm_stderr": 0.035149425512674366
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746304,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746304
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.02241804289111394,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.02241804289111394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3258064516129032,
"acc_stderr": 0.026662010578567104,
"acc_norm": 0.3258064516129032,
"acc_norm_stderr": 0.026662010578567104
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070643,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070643
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39378238341968913,
"acc_stderr": 0.03526077095548237,
"acc_norm": 0.39378238341968913,
"acc_norm_stderr": 0.03526077095548237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886845,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886845
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42935779816513764,
"acc_stderr": 0.021222286397236504,
"acc_norm": 0.42935779816513764,
"acc_norm_stderr": 0.021222286397236504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3284313725490196,
"acc_stderr": 0.03296245110172228,
"acc_norm": 0.3284313725490196,
"acc_norm_stderr": 0.03296245110172228
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4050632911392405,
"acc_stderr": 0.031955147413706725,
"acc_norm": 0.4050632911392405,
"acc_norm_stderr": 0.031955147413706725
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.04142313771996664,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.04142313771996664
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.04562951548180765,
"acc_norm": 0.512396694214876,
"acc_norm_stderr": 0.04562951548180765
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04712821257426771,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04712821257426771
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.040598672469526864,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.040598672469526864
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.452991452991453,
"acc_stderr": 0.0326109987309862,
"acc_norm": 0.452991452991453,
"acc_norm_stderr": 0.0326109987309862
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40229885057471265,
"acc_stderr": 0.017535294529068955,
"acc_norm": 0.40229885057471265,
"acc_norm_stderr": 0.017535294529068955
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.37283236994219654,
"acc_stderr": 0.026033890613576277,
"acc_norm": 0.37283236994219654,
"acc_norm_stderr": 0.026033890613576277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.02791705074848462,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.02791705074848462
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.33641975308641975,
"acc_stderr": 0.026289734945952926,
"acc_norm": 0.33641975308641975,
"acc_norm_stderr": 0.026289734945952926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590624,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590624
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29465449804432853,
"acc_stderr": 0.011643576764069541,
"acc_norm": 0.29465449804432853,
"acc_norm_stderr": 0.011643576764069541
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.33986928104575165,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.33986928104575165,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4228855721393035,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.4228855721393035,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.0381107966983353,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.0381107966983353
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427686,
"mc2": 0.3353289270087254,
"mc2_stderr": 0.013074362091466094
}
}
```
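The same aggregated numbers are also exposed through the `results` configuration listed above, whose `latest` split points at the most recent `results_*.parquet` file. A minimal sketch for loading it (the parquet's column layout is not documented on this card, so inspect the features before indexing into them):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_Enno-Ai__ennodata-7b",
    "results",
    split="latest",
)
# The column layout mirrors the results_*.parquet produced by the harness;
# print the features to see which fields are available.
print(results.features)
```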
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_acrastt__Marx-3B | 2023-08-27T12:40:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of acrastt/Marx-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [acrastt/Marx-3B](https://huggingface.co/acrastt/Marx-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__Marx-3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T00:59:52.593493](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Marx-3B/blob/main/results_2023-08-18T00%3A59%3A52.593493.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27250732365345814,\n\
\ \"acc_stderr\": 0.03211960436903768,\n \"acc_norm\": 0.2758437762205193,\n\
\ \"acc_norm_stderr\": 0.03211455115116104,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.360192370942906,\n\
\ \"mc2_stderr\": 0.013554650964764735\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38310580204778155,\n \"acc_stderr\": 0.014206472661672881,\n\
\ \"acc_norm\": 0.4052901023890785,\n \"acc_norm_stderr\": 0.014346869060229321\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5326628161720772,\n\
\ \"acc_stderr\": 0.0049791232365079706,\n \"acc_norm\": 0.707329217287393,\n\
\ \"acc_norm_stderr\": 0.00454058698322999\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.036906779861372814,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.036906779861372814\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749884,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749884\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.02834696377716246,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.02834696377716246\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.03664666337225256,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.03664666337225256\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.033333333333333375,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.033333333333333375\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.022755204959542936,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.022755204959542936\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678241,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678241\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511784,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511784\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.02860620428922989,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.02860620428922989\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.021916957709213796,\n\
\ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.021916957709213796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888238,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888238\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823965,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823965\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.02891120880274947,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.02891120880274947\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n\
\ \"acc_stderr\": 0.01620379270319779,\n \"acc_norm\": 0.2886334610472541,\n\
\ \"acc_norm_stderr\": 0.01620379270319779\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.024332146779134124,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.024332146779134124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31511254019292606,\n\
\ \"acc_stderr\": 0.026385273703464496,\n \"acc_norm\": 0.31511254019292606,\n\
\ \"acc_norm_stderr\": 0.026385273703464496\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2993827160493827,\n \"acc_stderr\": 0.02548311560119547,\n\
\ \"acc_norm\": 0.2993827160493827,\n \"acc_norm_stderr\": 0.02548311560119547\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843017,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843017\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2627118644067797,\n\
\ \"acc_stderr\": 0.01124054551499568,\n \"acc_norm\": 0.2627118644067797,\n\
\ \"acc_norm_stderr\": 0.01124054551499568\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.022368672562886754,\n\
\ \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.022368672562886754\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815194,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39183673469387753,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.39183673469387753,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.031417842916639266,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.031417842916639266\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.360192370942906,\n\
\ \"mc2_stderr\": 0.013554650964764735\n }\n}\n```"
repo_url: https://huggingface.co/acrastt/Marx-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|arc:challenge|25_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|arc:challenge|25_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hellaswag|10_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hellaswag|10_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:19:30.468267.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:19:44.606324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:46:31.661460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:59:52.593493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:59:52.593493.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:19:30.468267.parquet'
- split: 2023_08_17T23_19_44.606324
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T23:19:44.606324.parquet'
- split: 2023_08_17T23_46_31.661460
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T23:46:31.661460.parquet'
- split: 2023_08_18T00_59_52.593493
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:59:52.593493.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:59:52.593493.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_19_30.468267
path:
- results_2023-08-17T19:19:30.468267.parquet
- split: 2023_08_17T23_19_44.606324
path:
- results_2023-08-17T23:19:44.606324.parquet
- split: 2023_08_17T23_46_31.661460
path:
- results_2023-08-17T23:46:31.661460.parquet
- split: 2023_08_18T00_59_52.593493
path:
- results_2023-08-18T00:59:52.593493.parquet
- split: latest
path:
- results_2023-08-18T00:59:52.593493.parquet
---
# Dataset Card for Evaluation run of acrastt/Marx-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/acrastt/Marx-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [acrastt/Marx-3B](https://huggingface.co/acrastt/Marx-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, one for each evaluated task.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__Marx-3B",
"harness_truthfulqa_mc_0",
	split="latest")
```
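
The same pattern works for any of the per-task configurations and for the aggregated `results` configuration listed above. As a minimal sketch (the config and split names below are taken directly from the file listing in this card; converting to pandas is optional and only used here for inspection):

```python
from datasets import load_dataset

# Per-task details: load the most recent run of one MMLU subject.
details = load_dataset(
    "open-llm-leaderboard/details_acrastt__Marx-3B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Aggregated metrics: load a specific timestamped run of the "results" config.
results = load_dataset(
    "open-llm-leaderboard/details_acrastt__Marx-3B",
    "results",
    split="2023_08_18T00_59_52.593493",
)

# Optional: inspect as pandas DataFrames.
print(details.to_pandas().head())
print(results.to_pandas().head())
```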
## Latest results
These are the [latest results from run 2023-08-18T00:59:52.593493](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Marx-3B/blob/main/results_2023-08-18T00%3A59%3A52.593493.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27250732365345814,
"acc_stderr": 0.03211960436903768,
"acc_norm": 0.2758437762205193,
"acc_norm_stderr": 0.03211455115116104,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.360192370942906,
"mc2_stderr": 0.013554650964764735
},
"harness|arc:challenge|25": {
"acc": 0.38310580204778155,
"acc_stderr": 0.014206472661672881,
"acc_norm": 0.4052901023890785,
"acc_norm_stderr": 0.014346869060229321
},
"harness|hellaswag|10": {
"acc": 0.5326628161720772,
"acc_stderr": 0.0049791232365079706,
"acc_norm": 0.707329217287393,
"acc_norm_stderr": 0.00454058698322999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.036906779861372814,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.036906779861372814
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749884,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749884
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.02834696377716246,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.02834696377716246
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.03664666337225256,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.03664666337225256
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.033333333333333375,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.033333333333333375
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2,
"acc_stderr": 0.022755204959542936,
"acc_norm": 0.2,
"acc_norm_stderr": 0.022755204959542936
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678241,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678241
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511784,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511784
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.02860620428922989,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.02860620428922989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.021916957709213796,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.021916957709213796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888238,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823965,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823965
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274947,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274947
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.01620379270319779,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.01620379270319779
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.024332146779134124,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.024332146779134124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31511254019292606,
"acc_stderr": 0.026385273703464496,
"acc_norm": 0.31511254019292606,
"acc_norm_stderr": 0.026385273703464496
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2993827160493827,
"acc_stderr": 0.02548311560119547,
"acc_norm": 0.2993827160493827,
"acc_norm_stderr": 0.02548311560119547
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843017,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843017
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2627118644067797,
"acc_stderr": 0.01124054551499568,
"acc_norm": 0.2627118644067797,
"acc_norm_stderr": 0.01124054551499568
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.022368672562886754,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.022368672562886754
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815194,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39183673469387753,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.39183673469387753,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.031417842916639266,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.031417842916639266
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.360192370942906,
"mc2_stderr": 0.013554650964764735
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EgilKarlsen/Spirit_GPTNEO_Finetuned | 2023-08-23T05:55:17.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 307650065.625
num_examples: 37500
- name: test
num_bytes: 102550020.0
num_examples: 12500
download_size: 565195315
dataset_size: 410200085.625
---
# Dataset Card for "Spirit_GPTNEO_Finetuned"
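The schema above lists 2048 `float32` feature columns (named `'0'` through `'2047'`) plus a string `label`, with a 37,500-example `train` split and a 12,500-example `test` split. As a minimal sketch of how those columns could be stacked into a feature matrix (the repository id below is a placeholder, not taken from this card), you might do:

```python
from datasets import load_dataset
import numpy as np

# NOTE: "<namespace>/Spirit_GPTNEO_Finetuned" is a placeholder repo id --
# substitute the actual path of this dataset on the Hub.
ds = load_dataset("<namespace>/Spirit_GPTNEO_Finetuned", split="train")
ds.set_format("numpy")

# Columns '0'..'2047' hold the float32 feature dimensions; 'label' is a string.
feature_cols = [str(i) for i in range(2048)]
X = np.stack([ds[c] for c in feature_cols], axis=1)  # expected shape: (37500, 2048)
labels = list(ds["label"])
print(X.shape, len(labels))
```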
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_acrastt__Griffin-3B | 2023-08-27T12:40:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of acrastt/Griffin-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [acrastt/Griffin-3B](https://huggingface.co/acrastt/Griffin-3B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__Griffin-3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T04:28:39.575079](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Griffin-3B/blob/main/results_2023-08-18T04%3A28%3A39.575079.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27033092758344046,\n\
\ \"acc_stderr\": 0.032067940116846425,\n \"acc_norm\": 0.27402727312364494,\n\
\ \"acc_norm_stderr\": 0.03206319640670025,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.3833047919695716,\n\
\ \"mc2_stderr\": 0.013851464104298106\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38054607508532423,\n \"acc_stderr\": 0.014188277712349814,\n\
\ \"acc_norm\": 0.4180887372013652,\n \"acc_norm_stderr\": 0.014413988396996076\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5424218283210516,\n\
\ \"acc_stderr\": 0.004971789638563323,\n \"acc_norm\": 0.7229635530770763,\n\
\ \"acc_norm_stderr\": 0.004466200055292544\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501715,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669416,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669416\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2161290322580645,\n \"acc_stderr\": 0.02341529343356852,\n \"\
acc_norm\": 0.2161290322580645,\n \"acc_norm_stderr\": 0.02341529343356852\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2315270935960591,\n \"acc_stderr\": 0.02967833314144444,\n \"\
acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.02967833314144444\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463348,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463348\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136088,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24220183486238533,\n\
\ \"acc_stderr\": 0.018368176306598618,\n \"acc_norm\": 0.24220183486238533,\n\
\ \"acc_norm_stderr\": 0.018368176306598618\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767478,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767478\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n\
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4125560538116592,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.4125560538116592,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.30578512396694213,\n \"acc_stderr\": 0.04205953933884124,\n \"\
acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.04205953933884124\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\
\ \"acc_stderr\": 0.016246087069701393,\n \"acc_norm\": 0.29118773946360155,\n\
\ \"acc_norm_stderr\": 0.016246087069701393\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508304,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508304\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961443,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961443\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875202,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875202\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135107,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.01094657096634877,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.01094657096634877\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.024723110407677055,\n\
\ \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.024723110407677055\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174934,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174934\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.3833047919695716,\n\
\ \"mc2_stderr\": 0.013851464104298106\n }\n}\n```"
repo_url: https://huggingface.co/acrastt/Griffin-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|arc:challenge|25_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|arc:challenge|25_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hellaswag|10_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hellaswag|10_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:59:18.128878.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T04:28:39.575079.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T04:28:39.575079.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T03:59:18.128878.parquet'
- split: 2023_08_18T04_28_39.575079
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T04:28:39.575079.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T04:28:39.575079.parquet'
- config_name: results
data_files:
- split: 2023_08_18T03_59_18.128878
path:
- results_2023-08-18T03:59:18.128878.parquet
- split: 2023_08_18T04_28_39.575079
path:
- results_2023-08-18T04:28:39.575079.parquet
- split: latest
path:
- results_2023-08-18T04:28:39.575079.parquet
---
# Dataset Card for Evaluation run of acrastt/Griffin-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/acrastt/Griffin-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [acrastt/Griffin-3B](https://huggingface.co/acrastt/Griffin-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__Griffin-3B",
"harness_truthfulqa_mc_0",
split="train")
```
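Beyond the single example above, each evaluated task has its own configuration and each run its own split, so you can also enumerate the available configurations and pull the "latest" split of a specific task. This is a minimal sketch using the standard `datasets` helpers; the configuration name shown is taken from the list in the header above, and timestamped splits (e.g. 2023_08_18T04_28_39.575079) can be used in place of "latest" to target a specific run:

```python
from datasets import get_dataset_config_names, load_dataset

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_acrastt__Griffin-3B")
print(len(configs), configs[:5])

# Load the per-example details of one task; the "latest" split always points to
# the most recent run, while timestamped splits select a specific run.
data = load_dataset(
    "open-llm-leaderboard/details_acrastt__Griffin-3B",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(data)
```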
## Latest results
These are the [latest results from run 2023-08-18T04:28:39.575079](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Griffin-3B/blob/main/results_2023-08-18T04%3A28%3A39.575079.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27033092758344046,
"acc_stderr": 0.032067940116846425,
"acc_norm": 0.27402727312364494,
"acc_norm_stderr": 0.03206319640670025,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.3833047919695716,
"mc2_stderr": 0.013851464104298106
},
"harness|arc:challenge|25": {
"acc": 0.38054607508532423,
"acc_stderr": 0.014188277712349814,
"acc_norm": 0.4180887372013652,
"acc_norm_stderr": 0.014413988396996076
},
"harness|hellaswag|10": {
"acc": 0.5424218283210516,
"acc_stderr": 0.004971789638563323,
"acc_norm": 0.7229635530770763,
"acc_norm_stderr": 0.004466200055292544
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.026616482980501715,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.026616482980501715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669416,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669416
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.02967833314144444,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.02967833314144444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463348,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463348
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136088,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767478,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767478
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4125560538116592,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.4125560538116592,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.04205953933884124,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.04205953933884124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29118773946360155,
"acc_stderr": 0.016246087069701393,
"acc_norm": 0.29118773946360155,
"acc_norm_stderr": 0.016246087069701393
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.023786203255508304,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.023786203255508304
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961443,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961443
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875202,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875202
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135107,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.01094657096634877,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.01094657096634877
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.024723110407677055,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.024723110407677055
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.3833047919695716,
"mc2_stderr": 0.013851464104298106
}
}
```
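If you only need the aggregated numbers rather than per-example details, the "results" configuration declared in the header above can be loaded the same way. This is a sketch, not an official API for reading leaderboard results: the exact flattened column layout of the aggregated parquet files is an assumption, so inspect the features before relying on any particular column name:

```python
from datasets import load_dataset

# The aggregated metrics shown above are also stored as a "results" configuration;
# its "latest" split points to the most recent run (2023-08-18T04:28:39.575079 here).
results = load_dataset(
    "open-llm-leaderboard/details_acrastt__Griffin-3B",
    "results",
    split="latest",
)

# Print the dataset and a few column names to discover the stored layout before
# extracting specific metrics such as "all" accuracy or per-task scores.
print(results)
print(results.column_names[:10])
```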
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_acrastt__Puma-3B | 2023-08-27T12:40:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of acrastt/Puma-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [acrastt/Puma-3B](https://huggingface.co/acrastt/Puma-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__Puma-3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T01:05:27.057546](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Puma-3B/blob/main/results_2023-08-18T01%3A05%3A27.057546.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25620413577994805,\n\
\ \"acc_stderr\": 0.031504600023976086,\n \"acc_norm\": 0.2597155865644227,\n\
\ \"acc_norm_stderr\": 0.031501600658633055,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931584,\n \"mc2\": 0.3595915269505455,\n\
\ \"mc2_stderr\": 0.013526241192039292\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35238907849829354,\n \"acc_stderr\": 0.013960142600598687,\n\
\ \"acc_norm\": 0.38310580204778155,\n \"acc_norm_stderr\": 0.014206472661672876\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5266879107747461,\n\
\ \"acc_stderr\": 0.004982668452118946,\n \"acc_norm\": 0.7031467835092611,\n\
\ \"acc_norm_stderr\": 0.004559375835805972\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n\
\ \"acc_stderr\": 0.02960562398177122,\n \"acc_norm\": 0.18497109826589594,\n\
\ \"acc_norm_stderr\": 0.02960562398177122\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880554,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880554\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.041857744240220575,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.041857744240220575\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.03147830790259574,\n\
\ \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.03147830790259574\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242494,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242494\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.19032258064516128,\n \"acc_stderr\": 0.02233170761182307,\n \"\
acc_norm\": 0.19032258064516128,\n \"acc_norm_stderr\": 0.02233170761182307\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n \"acc_norm\"\
: 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790503,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790503\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212802,\n \
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212802\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.1962962962962963,\n \"acc_stderr\": 0.024217421327417145,\n \
\ \"acc_norm\": 0.1962962962962963,\n \"acc_norm_stderr\": 0.024217421327417145\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.018224078117299085,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.018224078117299085\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057986,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057986\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n\
\ \"acc_stderr\": 0.016095302969878555,\n \"acc_norm\": 0.2822477650063857,\n\
\ \"acc_norm_stderr\": 0.016095302969878555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n\
\ \"acc_stderr\": 0.02472386150477169,\n \"acc_norm\": 0.2540192926045016,\n\
\ \"acc_norm_stderr\": 0.02472386150477169\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.22816166883963493,\n\
\ \"acc_stderr\": 0.01071799219204788,\n \"acc_norm\": 0.22816166883963493,\n\
\ \"acc_norm_stderr\": 0.01071799219204788\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.02352924218519311,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.02352924218519311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.029043088683304324,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.029043088683304324\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n\
\ \"acc_stderr\": 0.03187187537919796,\n \"acc_norm\": 0.2835820895522388,\n\
\ \"acc_norm_stderr\": 0.03187187537919796\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824565,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824565\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931584,\n \"mc2\": 0.3595915269505455,\n\
\ \"mc2_stderr\": 0.013526241192039292\n }\n}\n```"
repo_url: https://huggingface.co/acrastt/Puma-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|arc:challenge|25_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|arc:challenge|25_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hellaswag|10_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hellaswag|10_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:20:55.722583.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:25:19.325666.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:52:04.586597.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:05:27.057546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:05:27.057546.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:20:55.722583.parquet'
- split: 2023_08_17T23_25_19.325666
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T23:25:19.325666.parquet'
- split: 2023_08_17T23_52_04.586597
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T23:52:04.586597.parquet'
- split: 2023_08_18T01_05_27.057546
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:05:27.057546.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:05:27.057546.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_20_55.722583
path:
- results_2023-08-17T19:20:55.722583.parquet
- split: 2023_08_17T23_25_19.325666
path:
- results_2023-08-17T23:25:19.325666.parquet
- split: 2023_08_17T23_52_04.586597
path:
- results_2023-08-17T23:52:04.586597.parquet
- split: 2023_08_18T01_05_27.057546
path:
- results_2023-08-18T01:05:27.057546.parquet
- split: latest
path:
- results_2023-08-18T01:05:27.057546.parquet
---
# Dataset Card for Evaluation run of acrastt/Puma-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/acrastt/Puma-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [acrastt/Puma-3B](https://huggingface.co/acrastt/Puma-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__Puma-3B",
"harness_truthfulqa_mc_0",
split="train")
```
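As a complement, here is a minimal sketch for reading the aggregated metrics rather than the per-sample details. It assumes a recent version of the `datasets` library; the configuration and split names are taken from the `configs` listing above, where the "results" configuration exposes a "latest" split alongside one split per timestamped run.
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of every run;
# the "latest" split always mirrors the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_acrastt__Puma-3B",
                       "results",
                       split="latest")

# Inspect one row of aggregated metrics.
print(results[0])
```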
## Latest results
These are the [latest results from run 2023-08-18T01:05:27.057546](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__Puma-3B/blob/main/results_2023-08-18T01%3A05%3A27.057546.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25620413577994805,
"acc_stderr": 0.031504600023976086,
"acc_norm": 0.2597155865644227,
"acc_norm_stderr": 0.031501600658633055,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931584,
"mc2": 0.3595915269505455,
"mc2_stderr": 0.013526241192039292
},
"harness|arc:challenge|25": {
"acc": 0.35238907849829354,
"acc_stderr": 0.013960142600598687,
"acc_norm": 0.38310580204778155,
"acc_norm_stderr": 0.014206472661672876
},
"harness|hellaswag|10": {
"acc": 0.5266879107747461,
"acc_stderr": 0.004982668452118946,
"acc_norm": 0.7031467835092611,
"acc_norm_stderr": 0.004559375835805972
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106136,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177122,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177122
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880554,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880554
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220575,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220575
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.03147830790259574,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.03147830790259574
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242494,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242494
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790503,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790503
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212802,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212802
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.1962962962962963,
"acc_stderr": 0.024217421327417145,
"acc_norm": 0.1962962962962963,
"acc_norm_stderr": 0.024217421327417145
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.018224078117299085,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.018224078117299085
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.027467401804057986,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.027467401804057986
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878555,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.02472386150477169,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.02472386150477169
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.22816166883963493,
"acc_stderr": 0.01071799219204788,
"acc_norm": 0.22816166883963493,
"acc_norm_stderr": 0.01071799219204788
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.02352924218519311,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.02352924218519311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.029043088683304324,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.029043088683304324
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.03187187537919796,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.03187187537919796
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931584,
"mc2": 0.3595915269505455,
"mc2_stderr": 0.013526241192039292
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CalderaAI__30B-Epsilon | 2023-09-23T06:45:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CalderaAI/30B-Epsilon
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CalderaAI/30B-Epsilon](https://huggingface.co/CalderaAI/30B-Epsilon) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CalderaAI__30B-Epsilon\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T06:45:40.292570](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__30B-Epsilon/blob/main/results_2023-09-23T06-45-40.292570.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2751677852348993,\n\
\ \"em_stderr\": 0.004573589572048243,\n \"f1\": 0.35816799496644447,\n\
\ \"f1_stderr\": 0.004490769156956796,\n \"acc\": 0.4417684464744225,\n\
\ \"acc_stderr\": 0.010108340065362846\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2751677852348993,\n \"em_stderr\": 0.004573589572048243,\n\
\ \"f1\": 0.35816799496644447,\n \"f1_stderr\": 0.004490769156956796\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \
\ \"acc_stderr\": 0.008510982565520488\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205203\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CalderaAI/30B-Epsilon
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_09T14_24_13.994751
path:
- '**/details_harness|drop|3_2023-09-09T14-24-13.994751.parquet'
- split: 2023_09_23T06_45_40.292570
path:
- '**/details_harness|drop|3_2023-09-23T06-45-40.292570.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T06-45-40.292570.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_09T14_24_13.994751
path:
- '**/details_harness|gsm8k|5_2023-09-09T14-24-13.994751.parquet'
- split: 2023_09_23T06_45_40.292570
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-45-40.292570.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-45-40.292570.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:47:15.382915.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:47:15.382915.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_09T14_24_13.994751
path:
- '**/details_harness|winogrande|5_2023-09-09T14-24-13.994751.parquet'
- split: 2023_09_23T06_45_40.292570
path:
- '**/details_harness|winogrande|5_2023-09-23T06-45-40.292570.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T06-45-40.292570.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_47_15.382915
path:
- results_2023-08-17T19:47:15.382915.parquet
- split: 2023_09_09T14_24_13.994751
path:
- results_2023-09-09T14-24-13.994751.parquet
- split: 2023_09_23T06_45_40.292570
path:
- results_2023-09-23T06-45-40.292570.parquet
- split: latest
path:
- results_2023-09-23T06-45-40.292570.parquet
---
# Dataset Card for Evaluation run of CalderaAI/30B-Epsilon
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CalderaAI/30B-Epsilon
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CalderaAI/30B-Epsilon](https://huggingface.co/CalderaAI/30B-Epsilon) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CalderaAI__30B-Epsilon",
"harness_winogrande_5",
split="train")
```
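For discovery, a short sketch like the one below (assuming a recent `datasets` release that exports `get_dataset_config_names` and `get_dataset_split_names`) lists the available configurations and the splits of one of them; apart from "latest", each split name corresponds to one timestamped evaluation run from the `configs` section above.
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_CalderaAI__30B-Epsilon"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)

# Splits of a single configuration: one per timestamped run, plus "latest".
splits = get_dataset_split_names(repo, "harness_winogrande_5")

print(len(configs), "configurations")
print(splits)
```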
## Latest results
These are the [latest results from run 2023-09-23T06:45:40.292570](https://huggingface.co/datasets/open-llm-leaderboard/details_CalderaAI__30B-Epsilon/blob/main/results_2023-09-23T06-45-40.292570.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2751677852348993,
"em_stderr": 0.004573589572048243,
"f1": 0.35816799496644447,
"f1_stderr": 0.004490769156956796,
"acc": 0.4417684464744225,
"acc_stderr": 0.010108340065362846
},
"harness|drop|3": {
"em": 0.2751677852348993,
"em_stderr": 0.004573589572048243,
"f1": 0.35816799496644447,
"f1_stderr": 0.004490769156956796
},
"harness|gsm8k|5": {
"acc": 0.1068991660348749,
"acc_stderr": 0.008510982565520488
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205203
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-orca-chat-10k | 2023-08-27T12:40:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of quantumaikr/llama-2-70b-fb16-orca-chat-10k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/llama-2-70b-fb16-orca-chat-10k](https://huggingface.co/quantumaikr/llama-2-70b-fb16-orca-chat-10k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-orca-chat-10k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T21:37:12.844888](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-orca-chat-10k/blob/main/results_2023-08-17T21%3A37%3A12.844888.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6906582818144935,\n\
\ \"acc_stderr\": 0.03127946887219238,\n \"acc_norm\": 0.6949758655540409,\n\
\ \"acc_norm_stderr\": 0.03124780295608181,\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.01737452048251371,\n \"mc2\": 0.6155546385905364,\n\
\ \"mc2_stderr\": 0.014689914393957965\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407163,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173307\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6731726747659829,\n\
\ \"acc_stderr\": 0.004680949283855315,\n \"acc_norm\": 0.8707428799044015,\n\
\ \"acc_norm_stderr\": 0.003347986669565309\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382175,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.022037217340267822,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.022037217340267822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284367,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284367\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822523,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.0228158130988966,\n \
\ \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.0228158130988966\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603396,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603396\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8899082568807339,\n\
\ \"acc_stderr\": 0.0134199390186812,\n \"acc_norm\": 0.8899082568807339,\n\
\ \"acc_norm_stderr\": 0.0134199390186812\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n\
\ \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8823529411764706,\n \"acc_stderr\": 0.02261328660113202,\n \"\
acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.02261328660113202\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.02068174513588457,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.02068174513588457\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.0291998024556228,\n \"acc_norm\"\
: 0.8842975206611571,\n \"acc_norm_stderr\": 0.0291998024556228\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941635,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941635\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n\
\ \"acc_stderr\": 0.012282876868629234,\n \"acc_norm\": 0.8633461047254151,\n\
\ \"acc_norm_stderr\": 0.012282876868629234\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5631284916201117,\n\
\ \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.5631284916201117,\n\
\ \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.020423955354778034,\n\
\ \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.020423955354778034\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5354609929078015,\n \"acc_stderr\": 0.029752389657427058,\n \
\ \"acc_norm\": 0.5354609929078015,\n \"acc_norm_stderr\": 0.029752389657427058\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5378096479791395,\n\
\ \"acc_stderr\": 0.012733671880342504,\n \"acc_norm\": 0.5378096479791395,\n\
\ \"acc_norm_stderr\": 0.012733671880342504\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233815,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233815\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.01737452048251371,\n \"mc2\": 0.6155546385905364,\n\
\ \"mc2_stderr\": 0.014689914393957965\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/llama-2-70b-fb16-orca-chat-10k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|arc:challenge|25_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hellaswag|10_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:37:12.844888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:37:12.844888.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T21:37:12.844888.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T21:37:12.844888.parquet'
- config_name: results
data_files:
- split: 2023_08_17T21_37_12.844888
path:
- results_2023-08-17T21:37:12.844888.parquet
- split: latest
path:
- results_2023-08-17T21:37:12.844888.parquet
---
# Dataset Card for Evaluation run of quantumaikr/llama-2-70b-fb16-orca-chat-10k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/llama-2-70b-fb16-orca-chat-10k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-70b-fb16-orca-chat-10k](https://huggingface.co/quantumaikr/llama-2-70b-fb16-orca-chat-10k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-orca-chat-10k",
"harness_truthfulqa_mc_0",
split="train")
```
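The same call pattern works for any other configuration listed in this card's metadata. The sketch below is illustrative only: the configuration names (`harness_hellaswag_10`, `results`) and the `latest` split are taken from the config list above, while the exact column layout of each split is not documented here and should be inspected on the loaded dataset.
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-orca-chat-10k"

# Per-sample details for a single eval, using the "latest" split defined in the configs above.
hellaswag_details = load_dataset(repo, "harness_hellaswag_10", split="latest")
print(hellaswag_details.column_names)  # inspect the available fields
print(hellaswag_details[0])            # one row per evaluated example

# Aggregated metrics for the whole run, stored in the "results" configuration.
aggregated = load_dataset(repo, "results", split="latest")
print(aggregated[0])
```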
## Latest results
These are the [latest results from run 2023-08-17T21:37:12.844888](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70b-fb16-orca-chat-10k/blob/main/results_2023-08-17T21%3A37%3A12.844888.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6906582818144935,
"acc_stderr": 0.03127946887219238,
"acc_norm": 0.6949758655540409,
"acc_norm_stderr": 0.03124780295608181,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.01737452048251371,
"mc2": 0.6155546385905364,
"mc2_stderr": 0.014689914393957965
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407163,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173307
},
"harness|hellaswag|10": {
"acc": 0.6731726747659829,
"acc_stderr": 0.004680949283855315,
"acc_norm": 0.8707428799044015,
"acc_norm_stderr": 0.003347986669565309
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382175,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267822,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284367,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284367
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822523,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.0228158130988966,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.0228158130988966
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603396,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.0134199390186812,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.0134199390186812
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.02261328660113202,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.02261328660113202
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.02068174513588457,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.02068174513588457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.0291998024556228,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.0291998024556228
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941635,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941635
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8633461047254151,
"acc_stderr": 0.012282876868629234,
"acc_norm": 0.8633461047254151,
"acc_norm_stderr": 0.012282876868629234
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5631284916201117,
"acc_stderr": 0.01658868086453063,
"acc_norm": 0.5631284916201117,
"acc_norm_stderr": 0.01658868086453063
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.020423955354778034,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.020423955354778034
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5354609929078015,
"acc_stderr": 0.029752389657427058,
"acc_norm": 0.5354609929078015,
"acc_norm_stderr": 0.029752389657427058
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5378096479791395,
"acc_stderr": 0.012733671880342504,
"acc_norm": 0.5378096479791395,
"acc_norm_stderr": 0.012733671880342504
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233815,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233815
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.01737452048251371,
"mc2": 0.6155546385905364,
"mc2_stderr": 0.014689914393957965
}
}
```
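If you only need a headline number, the dictionary above can be post-processed directly. The snippet below is a minimal sketch that averages `acc_norm` over the `hendrycksTest` (MMLU) subtasks; for brevity it uses a hand-copied subset of the entries shown above rather than the full file, so the printed value is only illustrative.
```python
# A few entries copied from the results dictionary above; the real dict has one entry per task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.8026315789473685},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8713450292397661},
}

# Average acc_norm over whatever hendrycksTest (MMLU) subtasks are present.
mmlu_scores = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU subtasks: {len(mmlu_scores)}, mean acc_norm: {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```
Running the same loop over the full dictionary (or over the `results` configuration loaded above) yields a per-run MMLU average from the per-subtask scores.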
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_sia-ai__llama-2-7b-1-percent-open-orca-1000-steps-v0 | 2023-08-27T12:40:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0](https://huggingface.co/sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sia-ai__llama-2-7b-1-percent-open-orca-1000-steps-v0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T13:55:57.511446](https://huggingface.co/datasets/open-llm-leaderboard/details_sia-ai__llama-2-7b-1-percent-open-orca-1000-steps-v0/blob/main/results_2023-08-18T13%3A55%3A57.511446.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44963038289753166,\n\
\ \"acc_stderr\": 0.035105637297221146,\n \"acc_norm\": 0.45372191023144215,\n\
\ \"acc_norm_stderr\": 0.035091765596554185,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.45831334808971286,\n\
\ \"mc2_stderr\": 0.014734497939378888\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4735494880546075,\n \"acc_stderr\": 0.014590931358120172,\n\
\ \"acc_norm\": 0.5127986348122867,\n \"acc_norm_stderr\": 0.014606603181012541\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5853415654252141,\n\
\ \"acc_stderr\": 0.004916561213591285,\n \"acc_norm\": 0.7874925313682534,\n\
\ \"acc_norm_stderr\": 0.004082459051347828\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.03063562795796182,\n\
\ \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.03063562795796182\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.13725490196078433,\n \"acc_stderr\": 0.03424084669891521,\n\
\ \"acc_norm\": 0.13725490196078433,\n \"acc_norm_stderr\": 0.03424084669891521\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489364,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489364\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194978,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194978\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n\
\ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.03559443565563919,\n \"\
acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.03559443565563919\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.03340361906276586,\n\
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.03340361906276586\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694838,\n\
\ \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694838\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5504587155963303,\n \"acc_stderr\": 0.021327881417823363,\n \"\
acc_norm\": 0.5504587155963303,\n \"acc_norm_stderr\": 0.021327881417823363\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5980392156862745,\n\
\ \"acc_stderr\": 0.03441190023482465,\n \"acc_norm\": 0.5980392156862745,\n\
\ \"acc_norm_stderr\": 0.03441190023482465\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6413502109704642,\n \"acc_stderr\": 0.031219569445301843,\n\
\ \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.031219569445301843\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.04385162325601553,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.04385162325601553\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.03924746876751129,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.03924746876751129\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.030463656747340268,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.030463656747340268\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6130268199233716,\n\
\ \"acc_stderr\": 0.017417138059440136,\n \"acc_norm\": 0.6130268199233716,\n\
\ \"acc_norm_stderr\": 0.017417138059440136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n\
\ \"acc_stderr\": 0.015183844307206158,\n \"acc_norm\": 0.2905027932960894,\n\
\ \"acc_norm_stderr\": 0.015183844307206158\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5209003215434084,\n\
\ \"acc_stderr\": 0.028373270961069414,\n \"acc_norm\": 0.5209003215434084,\n\
\ \"acc_norm_stderr\": 0.028373270961069414\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323674,\n\
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35658409387222945,\n\
\ \"acc_stderr\": 0.012233642989273891,\n \"acc_norm\": 0.35658409387222945,\n\
\ \"acc_norm_stderr\": 0.012233642989273891\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43300653594771243,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.43300653594771243,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.44081632653061226,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.44081632653061226,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5621890547263682,\n\
\ \"acc_stderr\": 0.035080801121998406,\n \"acc_norm\": 0.5621890547263682,\n\
\ \"acc_norm_stderr\": 0.035080801121998406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.45831334808971286,\n\
\ \"mc2_stderr\": 0.014734497939378888\n }\n}\n```"
repo_url: https://huggingface.co/sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|arc:challenge|25_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hellaswag|10_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:55:57.511446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:55:57.511446.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T13:55:57.511446.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T13:55:57.511446.parquet'
- config_name: results
data_files:
- split: 2023_08_18T13_55_57.511446
path:
- results_2023-08-18T13:55:57.511446.parquet
- split: latest
path:
- results_2023-08-18T13:55:57.511446.parquet
---
# Dataset Card for Evaluation run of sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0](https://huggingface.co/sia-ai/llama-2-7b-1-percent-open-orca-1000-steps-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sia-ai__llama-2-7b-1-percent-open-orca-1000-steps-v0",
"harness_truthfulqa_mc_0",
split="train")
```
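For the most recent run of a single task, or for the aggregated scores, you can also target the "latest" splits and the "results" configuration mentioned above. A minimal sketch, reusing only configuration and split names defined in this card:
```python
from datasets import load_dataset
repo = "open-llm-leaderboard/details_sia-ai__llama-2-7b-1-percent-open-orca-1000-steps-v0"
# Aggregated scores of the run (the "results" configuration described above)
results = load_dataset(repo, "results", split="latest")
# Per-sample details for one task, always pointing at the most recent run
truthfulqa = load_dataset(repo, "harness_truthfulqa_mc_0", split="latest")
```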
## Latest results
These are the [latest results from run 2023-08-18T13:55:57.511446](https://huggingface.co/datasets/open-llm-leaderboard/details_sia-ai__llama-2-7b-1-percent-open-orca-1000-steps-v0/blob/main/results_2023-08-18T13%3A55%3A57.511446.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44963038289753166,
"acc_stderr": 0.035105637297221146,
"acc_norm": 0.45372191023144215,
"acc_norm_stderr": 0.035091765596554185,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.45831334808971286,
"mc2_stderr": 0.014734497939378888
},
"harness|arc:challenge|25": {
"acc": 0.4735494880546075,
"acc_stderr": 0.014590931358120172,
"acc_norm": 0.5127986348122867,
"acc_norm_stderr": 0.014606603181012541
},
"harness|hellaswag|10": {
"acc": 0.5853415654252141,
"acc_stderr": 0.004916561213591285,
"acc_norm": 0.7874925313682534,
"acc_norm_stderr": 0.004082459051347828
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.03063562795796182,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.03063562795796182
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.13725490196078433,
"acc_stderr": 0.03424084669891521,
"acc_norm": 0.13725490196078433,
"acc_norm_stderr": 0.03424084669891521
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489364,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489364
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194978,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194978
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.038881769216741004,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.038881769216741004
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.03559443565563919,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.03559443565563919
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.03340361906276586,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.03340361906276586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694838,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5504587155963303,
"acc_stderr": 0.021327881417823363,
"acc_norm": 0.5504587155963303,
"acc_norm_stderr": 0.021327881417823363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.03441190023482465,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.03441190023482465
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6413502109704642,
"acc_stderr": 0.031219569445301843,
"acc_norm": 0.6413502109704642,
"acc_norm_stderr": 0.031219569445301843
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.04385162325601553,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.04385162325601553
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.03924746876751129,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.03924746876751129
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.030463656747340268,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.030463656747340268
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6130268199233716,
"acc_stderr": 0.017417138059440136,
"acc_norm": 0.6130268199233716,
"acc_norm_stderr": 0.017417138059440136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.015183844307206158,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.015183844307206158
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5209003215434084,
"acc_stderr": 0.028373270961069414,
"acc_norm": 0.5209003215434084,
"acc_norm_stderr": 0.028373270961069414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.027801656212323674,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.027801656212323674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35658409387222945,
"acc_stderr": 0.012233642989273891,
"acc_norm": 0.35658409387222945,
"acc_norm_stderr": 0.012233642989273891
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43300653594771243,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.43300653594771243,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.44081632653061226,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.44081632653061226,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5621890547263682,
"acc_stderr": 0.035080801121998406,
"acc_norm": 0.5621890547263682,
"acc_norm_stderr": 0.035080801121998406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.45831334808971286,
"mc2_stderr": 0.014734497939378888
}
}
```
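If you prefer the raw JSON file linked above over the parquet-backed "results" configuration, a minimal sketch with `huggingface_hub` could look like this; the repo id and filename are taken from the link above, and it is worth inspecting the top-level keys before drilling into individual scores:
```python
import json
from huggingface_hub import hf_hub_download
# Download the raw results file referenced above (note repo_type="dataset")
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_sia-ai__llama-2-7b-1-percent-open-orca-1000-steps-v0",
    filename="results_2023-08-18T13:55:57.511446.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)
# Check the layout of the file before reading specific metrics
print(list(raw.keys()))
```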
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_circulus__Llama-2-13b-orca-v1 | 2023-09-17T12:51:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of circulus/Llama-2-13b-orca-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [circulus/Llama-2-13b-orca-v1](https://huggingface.co/circulus/Llama-2-13b-orca-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_circulus__Llama-2-13b-orca-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T12:50:57.881579](https://huggingface.co/datasets/open-llm-leaderboard/details_circulus__Llama-2-13b-orca-v1/blob/main/results_2023-09-17T12-50-57.881579.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1529991610738255,\n\
\ \"em_stderr\": 0.0036866006582882706,\n \"f1\": 0.2246581375838923,\n\
\ \"f1_stderr\": 0.003770616290655452,\n \"acc\": 0.44842066021890015,\n\
\ \"acc_stderr\": 0.010546865226614108\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1529991610738255,\n \"em_stderr\": 0.0036866006582882706,\n\
\ \"f1\": 0.2246581375838923,\n \"f1_stderr\": 0.003770616290655452\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1288855193328279,\n \
\ \"acc_stderr\": 0.009229580761400274\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827943\n\
\ }\n}\n```"
repo_url: https://huggingface.co/circulus/Llama-2-13b-orca-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|arc:challenge|25_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T07_53_18.900339
path:
- '**/details_harness|drop|3_2023-09-17T07-53-18.900339.parquet'
- split: 2023_09_17T12_50_57.881579
path:
- '**/details_harness|drop|3_2023-09-17T12-50-57.881579.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T12-50-57.881579.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T07_53_18.900339
path:
- '**/details_harness|gsm8k|5_2023-09-17T07-53-18.900339.parquet'
- split: 2023_09_17T12_50_57.881579
path:
- '**/details_harness|gsm8k|5_2023-09-17T12-50-57.881579.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T12-50-57.881579.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hellaswag|10_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:46:04.009114.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T08:46:04.009114.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T08:46:04.009114.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T07_53_18.900339
path:
- '**/details_harness|winogrande|5_2023-09-17T07-53-18.900339.parquet'
- split: 2023_09_17T12_50_57.881579
path:
- '**/details_harness|winogrande|5_2023-09-17T12-50-57.881579.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T12-50-57.881579.parquet'
- config_name: results
data_files:
- split: 2023_08_18T08_46_04.009114
path:
- results_2023-08-18T08:46:04.009114.parquet
- split: 2023_09_17T07_53_18.900339
path:
- results_2023-09-17T07-53-18.900339.parquet
- split: 2023_09_17T12_50_57.881579
path:
- results_2023-09-17T12-50-57.881579.parquet
- split: latest
path:
- results_2023-09-17T12-50-57.881579.parquet
---
# Dataset Card for Evaluation run of circulus/Llama-2-13b-orca-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/circulus/Llama-2-13b-orca-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [circulus/Llama-2-13b-orca-v1](https://huggingface.co/circulus/Llama-2-13b-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_circulus__Llama-2-13b-orca-v1",
"harness_winogrande_5",
split="train")
```
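The same call works for any of the configurations listed in the YAML header, and every configuration also exposes a `latest` split alias alongside the timestamped ones. As a minimal sketch (only the config and split names shown in this card are assumed), the aggregated `results` configuration can be loaded like this:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split is an alias
# for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_circulus__Llama-2-13b-orca-v1",
    "results",
    split="latest",
)
print(results)
```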
## Latest results
These are the [latest results from run 2023-09-17T12:50:57.881579](https://huggingface.co/datasets/open-llm-leaderboard/details_circulus__Llama-2-13b-orca-v1/blob/main/results_2023-09-17T12-50-57.881579.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1529991610738255,
"em_stderr": 0.0036866006582882706,
"f1": 0.2246581375838923,
"f1_stderr": 0.003770616290655452,
"acc": 0.44842066021890015,
"acc_stderr": 0.010546865226614108
},
"harness|drop|3": {
"em": 0.1529991610738255,
"em_stderr": 0.0036866006582882706,
"f1": 0.2246581375838923,
"f1_stderr": 0.003770616290655452
},
"harness|gsm8k|5": {
"acc": 0.1288855193328279,
"acc_stderr": 0.009229580761400274
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827943
}
}
```
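If you prefer the raw JSON over the parquet splits, the file linked above can also be fetched directly from the dataset repository; a small sketch using `huggingface_hub` (the exact nesting of the downloaded file may differ slightly from the dict shown above):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_circulus__Llama-2-13b-orca-v1",
    filename="results_2023-09-17T12-50-57.881579.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# Inspect the top-level structure before drilling into individual tasks.
print(list(raw.keys()))
```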
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B | 2023-09-23T09:15:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of garage-bAInd/Camel-Platypus2-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [garage-bAInd/Camel-Platypus2-70B](https://huggingface.co/garage-bAInd/Camel-Platypus2-70B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T09:15:03.498663](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B/blob/main/results_2023-09-23T09-15-03.498663.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5068162751677853,\n\
\ \"em_stderr\": 0.005119992158743519,\n \"f1\": 0.5610287332214777,\n\
\ \"f1_stderr\": 0.004821120410845756,\n \"acc\": 0.5335809039518948,\n\
\ \"acc_stderr\": 0.010961770451355313\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5068162751677853,\n \"em_stderr\": 0.005119992158743519,\n\
\ \"f1\": 0.5610287332214777,\n \"f1_stderr\": 0.004821120410845756\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22896133434420016,\n \
\ \"acc_stderr\": 0.011573412892418223\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n\
\ }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Camel-Platypus2-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|drop|3_2023-09-23T09-15-03.498663.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T09-15-03.498663.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|gsm8k|5_2023-09-23T09-15-03.498663.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T09-15-03.498663.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|winogrande|5_2023-09-23T09-15-03.498663.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T09-15-03.498663.parquet'
- config_name: results
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- results_2023-08-18T00:04:49.359575.parquet
- split: 2023_09_23T09_15_03.498663
path:
- results_2023-09-23T09-15-03.498663.parquet
- split: latest
path:
- results_2023-09-23T09-15-03.498663.parquet
---
# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Camel-Platypus2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Camel-Platypus2-70B](https://huggingface.co/garage-bAInd/Camel-Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B",
"harness_winogrande_5",
split="train")
```
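Each task configuration listed in the YAML header above also exposes a "latest" split, so the most recent per-sample details can be loaded directly instead of going through "train". A minimal sketch, assuming the "latest" split loads like any other named split:
```python
from datasets import load_dataset

# Per-sample details for the most recent Winogrande run of this model.
latest_details = load_dataset(
    "open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B",
    "harness_winogrande_5",
    split="latest",
)

# Quick inspection: number of evaluated examples and the first record.
print(latest_details.num_rows)
print(latest_details[0])
```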
## Latest results
These are the [latest results from run 2023-09-23T09:15:03.498663](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B/blob/main/results_2023-09-23T09-15-03.498663.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.5068162751677853,
"em_stderr": 0.005119992158743519,
"f1": 0.5610287332214777,
"f1_stderr": 0.004821120410845756,
"acc": 0.5335809039518948,
"acc_stderr": 0.010961770451355313
},
"harness|drop|3": {
"em": 0.5068162751677853,
"em_stderr": 0.005119992158743519,
"f1": 0.5610287332214777,
"f1_stderr": 0.004821120410845756
},
"harness|gsm8k|5": {
"acc": 0.22896133434420016,
"acc_stderr": 0.011573412892418223
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292406
}
}
```
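The aggregated numbers shown above live in the "results" configuration, which keeps one split per run (named after the run's timestamp) plus a "latest" alias. A minimal sketch for comparing the two runs of this model, using the split names taken from the YAML header above:
```python
from datasets import load_dataset

# Aggregated metrics for each evaluation run; split names follow the
# timestamped naming used in the "results" configuration above.
for split_name in ("2023_08_18T00_04_49.359575", "2023_09_23T09_15_03.498663"):
    run_results = load_dataset(
        "open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B",
        "results",
        split=split_name,
    )
    print(split_name, run_results[0])
```
Note that the two runs covered different task sets (the earlier run evaluated ARC, HellaSwag, MMLU, and TruthfulQA; the later one DROP, GSM8K, and Winogrande), so the records returned for each split contain different keys.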
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Fredithefish__ReasonixPajama-3B-HF | 2023-08-27T12:40:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Fredithefish/ReasonixPajama-3B-HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/ReasonixPajama-3B-HF](https://huggingface.co/Fredithefish/ReasonixPajama-3B-HF)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__ReasonixPajama-3B-HF\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T15:18:48.992858](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__ReasonixPajama-3B-HF/blob/main/results_2023-08-17T15%3A18%3A48.992858.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26615294293130287,\n\
\ \"acc_stderr\": 0.03194662562788747,\n \"acc_norm\": 0.26945529622096925,\n\
\ \"acc_norm_stderr\": 0.031948563089195504,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.554154975832954,\n\
\ \"mc2_stderr\": 0.014463174363079511\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35409556313993173,\n \"acc_stderr\": 0.01397545412275656,\n\
\ \"acc_norm\": 0.3924914675767918,\n \"acc_norm_stderr\": 0.014269634635670714\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47829117705636326,\n\
\ \"acc_stderr\": 0.004985076094464756,\n \"acc_norm\": 0.6347341167098187,\n\
\ \"acc_norm_stderr\": 0.0048052057987245725\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343601,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343601\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708097,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708097\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388978,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388978\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20105820105820105,\n \"acc_stderr\": 0.020641810782370158,\n \"\
acc_norm\": 0.20105820105820105,\n \"acc_norm_stderr\": 0.020641810782370158\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624337,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624337\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732523,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.032396370467357015,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.032396370467357015\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.29743589743589743,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.29743589743589743,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766118,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766118\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882374,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882374\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26605504587155965,\n\
\ \"acc_stderr\": 0.01894602232222559,\n \"acc_norm\": 0.26605504587155965,\n\
\ \"acc_norm_stderr\": 0.01894602232222559\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353604,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353604\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3137254901960784,\n \"acc_stderr\": 0.03256685484460389,\n \"\
acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.03256685484460389\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3206751054852321,\n \"acc_stderr\": 0.030381931949990407,\n \
\ \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.030381931949990407\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n\
\ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.2556053811659193,\n\
\ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.02777883590493543,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.02777883590493543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455779,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455779\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410616,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410616\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042114,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.01091640673547895,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.01091640673547895\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.02909720956841196,\n\
\ \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.02909720956841196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2816326530612245,\n \"acc_stderr\": 0.02879518557429128,\n\
\ \"acc_norm\": 0.2816326530612245,\n \"acc_norm_stderr\": 0.02879518557429128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355547,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355547\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.03070982405056528,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.03070982405056528\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824562,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824562\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.554154975832954,\n\
\ \"mc2_stderr\": 0.014463174363079511\n }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/ReasonixPajama-3B-HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:18:48.992858.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:18:48.992858.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:18:48.992858.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:18:48.992858.parquet'
- config_name: results
data_files:
- split: 2023_08_17T15_18_48.992858
path:
- results_2023-08-17T15:18:48.992858.parquet
- split: latest
path:
- results_2023-08-17T15:18:48.992858.parquet
---
# Dataset Card for Evaluation run of Fredithefish/ReasonixPajama-3B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/ReasonixPajama-3B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/ReasonixPajama-3B-HF](https://huggingface.co/Fredithefish/ReasonixPajama-3B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__ReasonixPajama-3B-HF",
"harness_truthfulqa_mc_0",
split="train")
```
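The run-timestamp split names come from the configuration listing above; as a minimal sketch (assuming those split names), you could also target the latest run or pin a specific one explicitly:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Fredithefish__ReasonixPajama-3B-HF"

# "latest" always points at the most recent run of this task configuration
latest = load_dataset(REPO, "harness_truthfulqa_mc_0", split="latest")

# A run can also be pinned by its timestamped split name (see the configs above)
pinned = load_dataset(REPO, "harness_truthfulqa_mc_0", split="2023_08_17T15_18_48.992858")

print(latest)
print(pinned)
```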
## Latest results
These are the [latest results from run 2023-08-17T15:18:48.992858](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__ReasonixPajama-3B-HF/blob/main/results_2023-08-17T15%3A18%3A48.992858.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.26615294293130287,
"acc_stderr": 0.03194662562788747,
"acc_norm": 0.26945529622096925,
"acc_norm_stderr": 0.031948563089195504,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.554154975832954,
"mc2_stderr": 0.014463174363079511
},
"harness|arc:challenge|25": {
"acc": 0.35409556313993173,
"acc_stderr": 0.01397545412275656,
"acc_norm": 0.3924914675767918,
"acc_norm_stderr": 0.014269634635670714
},
"harness|hellaswag|10": {
"acc": 0.47829117705636326,
"acc_stderr": 0.004985076094464756,
"acc_norm": 0.6347341167098187,
"acc_norm_stderr": 0.0048052057987245725
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343601,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343601
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708097,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708097
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388978,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388978
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20105820105820105,
"acc_stderr": 0.020641810782370158,
"acc_norm": 0.20105820105820105,
"acc_norm_stderr": 0.020641810782370158
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624337,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624337
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732523,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.032396370467357015,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.032396370467357015
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.29743589743589743,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.29743589743589743,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766118,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766118
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882374,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882374
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26605504587155965,
"acc_stderr": 0.01894602232222559,
"acc_norm": 0.26605504587155965,
"acc_norm_stderr": 0.01894602232222559
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353604,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353604
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.03256685484460389,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.03256685484460389
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3206751054852321,
"acc_stderr": 0.030381931949990407,
"acc_norm": 0.3206751054852321,
"acc_norm_stderr": 0.030381931949990407
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2556053811659193,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.2556053811659193,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.02777883590493543,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.02777883590493543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455779,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455779
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410616,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410616
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042114,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.01091640673547895,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.01091640673547895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.02909720956841196,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.02909720956841196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2816326530612245,
"acc_stderr": 0.02879518557429128,
"acc_norm": 0.2816326530612245,
"acc_norm_stderr": 0.02879518557429128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355547,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355547
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.03070982405056528,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.03070982405056528
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824562,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824562
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.554154975832954,
"mc2_stderr": 0.014463174363079511
}
}
```
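To read these aggregated numbers programmatically rather than from the JSON above, a small sketch (using the "results" configuration and "latest" split declared in the configs; the exact column layout of the parquet file is an assumption you should check) could look like:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run
results = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__ReasonixPajama-3B-HF",
    "results",
    split="latest",
)

# Inspect the schema before relying on any particular field name
print(results.column_names)
print(results[0])
```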
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Panchovix__airoboros-33b-gpt4-1.2-SuperHOT-8k | 2023-09-17T16:57:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k](https://huggingface.co/Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Panchovix__airoboros-33b-gpt4-1.2-SuperHOT-8k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T16:57:27.002060](https://huggingface.co/datasets/open-llm-leaderboard/details_Panchovix__airoboros-33b-gpt4-1.2-SuperHOT-8k/blob/main/results_2023-09-17T16-57-27.002060.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642965811,\n \"f1\": 0.005930159395973156,\n\
\ \"f1_stderr\": 0.0006950327104148204,\n \"acc\": 0.2521704814522494,\n\
\ \"acc_stderr\": 0.007025978032038446\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965811,\n\
\ \"f1\": 0.005930159395973156,\n \"f1_stderr\": 0.0006950327104148204\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n\
\ \"acc_stderr\": 0.014051956064076892\n }\n}\n```"
repo_url: https://huggingface.co/Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|arc:challenge|25_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T16_57_27.002060
path:
- '**/details_harness|drop|3_2023-09-17T16-57-27.002060.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T16-57-27.002060.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T16_57_27.002060
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-57-27.002060.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T16-57-27.002060.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hellaswag|10_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T20:41:42.341199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T20:41:42.341199.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T20:41:42.341199.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T16_57_27.002060
path:
- '**/details_harness|winogrande|5_2023-09-17T16-57-27.002060.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T16-57-27.002060.parquet'
- config_name: results
data_files:
- split: 2023_08_17T20_41_42.341199
path:
- results_2023-08-17T20:41:42.341199.parquet
- split: 2023_09_17T16_57_27.002060
path:
- results_2023-09-17T16-57-27.002060.parquet
- split: latest
path:
- results_2023-09-17T16-57-27.002060.parquet
---
# Dataset Card for Evaluation run of Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k](https://huggingface.co/Panchovix/airoboros-33b-gpt4-1.2-SuperHOT-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Panchovix__airoboros-33b-gpt4-1.2-SuperHOT-8k",
"harness_winogrande_5",
split="train")
```
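Since the repository exposes one configuration per evaluated task plus a "results" configuration, it can help to enumerate configurations and splits first; a small sketch using the inspection helpers of the `datasets` library (assuming your environment can reach the Hub):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

REPO = "open-llm-leaderboard/details_Panchovix__airoboros-33b-gpt4-1.2-SuperHOT-8k"

# One configuration per evaluated task, plus "results"
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

# Each configuration has a "latest" split plus one split per run timestamp
print(get_dataset_split_names(REPO, "harness_winogrande_5"))
```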
## Latest results
These are the [latest results from run 2023-09-17T16:57:27.002060](https://huggingface.co/datasets/open-llm-leaderboard/details_Panchovix__airoboros-33b-gpt4-1.2-SuperHOT-8k/blob/main/results_2023-09-17T16-57-27.002060.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965811,
"f1": 0.005930159395973156,
"f1_stderr": 0.0006950327104148204,
"acc": 0.2521704814522494,
"acc_stderr": 0.007025978032038446
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965811,
"f1": 0.005930159395973156,
"f1_stderr": 0.0006950327104148204
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076892
}
}
```
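Because this repository holds two runs, the "results" configuration keeps one split per run timestamp in addition to "latest"; a minimal sketch (assuming the split names listed in the configs above) for pulling both side by side:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Panchovix__airoboros-33b-gpt4-1.2-SuperHOT-8k"

# Aggregated metrics from the most recent run (2023-09-17)
latest = load_dataset(REPO, "results", split="latest")

# Aggregated metrics from the earlier run (2023-08-17)
earlier = load_dataset(REPO, "results", split="2023_08_17T20_41_42.341199")

print(latest[0])
print(earlier[0])
```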
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla | 2023-09-17T08:49:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of l3utterfly/open-llama-3b-v2-layla
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [l3utterfly/open-llama-3b-v2-layla](https://huggingface.co/l3utterfly/open-llama-3b-v2-layla)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T08:49:03.131155](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla/blob/main/results_2023-09-17T08-49-03.131155.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.011954697986577181,\n\
\ \"em_stderr\": 0.0011130056898859086,\n \"f1\": 0.07875629194630916,\n\
\ \"f1_stderr\": 0.0018920865515620476,\n \"acc\": 0.3194349118852447,\n\
\ \"acc_stderr\": 0.008202509803690292\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.011954697986577181,\n \"em_stderr\": 0.0011130056898859086,\n\
\ \"f1\": 0.07875629194630916,\n \"f1_stderr\": 0.0018920865515620476\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \
\ \"acc_stderr\": 0.0028227133223877035\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6282557221783741,\n \"acc_stderr\": 0.013582306284992879\n\
\ }\n}\n```"
repo_url: https://huggingface.co/l3utterfly/open-llama-3b-v2-layla
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T08_49_03.131155
path:
- '**/details_harness|drop|3_2023-09-17T08-49-03.131155.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T08-49-03.131155.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T08_49_03.131155
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-49-03.131155.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-49-03.131155.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T08_49_03.131155
path:
- '**/details_harness|winogrande|5_2023-09-17T08-49-03.131155.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T08-49-03.131155.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- results_2023-08-18T14:37:31.844402.parquet
- split: 2023_09_17T08_49_03.131155
path:
- results_2023-09-17T08-49-03.131155.parquet
- split: latest
path:
- results_2023-09-17T08-49-03.131155.parquet
---
# Dataset Card for Evaluation run of l3utterfly/open-llama-3b-v2-layla
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/l3utterfly/open-llama-3b-v2-layla
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [l3utterfly/open-llama-3b-v2-layla](https://huggingface.co/l3utterfly/open-llama-3b-v2-layla) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla",
"harness_winogrande_5",
split="train")
```
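Each run is also exposed under its own timestamped split, so an analysis can be pinned to one specific evaluation instead of following `latest`. The snippet below is a minimal sketch using the split names listed in this card's configurations; adjust the configuration and split to the task and run you want to inspect.
```python
from datasets import load_dataset

# Load one specific run by its timestamped split (split names are listed under "configs" above),
# alongside the "latest" alias that always points to the most recent run.
pinned = load_dataset(
    "open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla",
    "harness_winogrande_5",
    split="2023_09_17T08_49_03.131155",
)
latest = load_dataset(
    "open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla",
    "harness_winogrande_5",
    split="latest",
)
print(len(pinned), len(latest))
```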
## Latest results
These are the [latest results from run 2023-09-17T08:49:03.131155](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla/blob/main/results_2023-09-17T08-49-03.131155.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.011954697986577181,
"em_stderr": 0.0011130056898859086,
"f1": 0.07875629194630916,
"f1_stderr": 0.0018920865515620476,
"acc": 0.3194349118852447,
"acc_stderr": 0.008202509803690292
},
"harness|drop|3": {
"em": 0.011954697986577181,
"em_stderr": 0.0011130056898859086,
"f1": 0.07875629194630916,
"f1_stderr": 0.0018920865515620476
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.0028227133223877035
},
"harness|winogrande|5": {
"acc": 0.6282557221783741,
"acc_stderr": 0.013582306284992879
}
}
```
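The aggregated numbers above are also stored in the "results" configuration, so they can be read programmatically rather than copied from this card. Here is a minimal sketch; the exact column layout of the results files is not documented here, so it only lists whatever columns the parquet file exposes and prints the first row.
```python
from datasets import load_dataset

# The "results" configuration aggregates all runs; the "latest" split points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla",
    "results",
    split="latest",
)

# Inspect the available columns and the first aggregated record.
print(results.column_names)
print(results[0])
```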
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_psmathur__model_101 | 2023-08-27T12:40:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of psmathur/model_101
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/model_101](https://huggingface.co/psmathur/model_101) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_101\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T01:38:15.380196](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_101/blob/main/results_2023-08-18T01%3A38%3A15.380196.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6979243699793755,\n\
\ \"acc_stderr\": 0.031179484382710294,\n \"acc_norm\": 0.7017586189998054,\n\
\ \"acc_norm_stderr\": 0.031151369790785297,\n \"mc1\": 0.423500611995104,\n\
\ \"mc1_stderr\": 0.017297421448534734,\n \"mc2\": 0.588506570085178,\n\
\ \"mc2_stderr\": 0.015040018908256268\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n\
\ \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623496\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.67805218084047,\n \
\ \"acc_stderr\": 0.0046626822330937834,\n \"acc_norm\": 0.8641704839673372,\n\
\ \"acc_norm_stderr\": 0.003419072480735362\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741706,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741706\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.033096151770590075,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.033096151770590075\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.03068302084323101,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.03068302084323101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.025680564640056882,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.025680564640056882\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.021417242936321593,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.021417242936321593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0301176889295036,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0301176889295036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822523,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8862385321100917,\n \"acc_stderr\": 0.013613614800232796,\n \"\
acc_norm\": 0.8862385321100917,\n \"acc_norm_stderr\": 0.013613614800232796\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.03019482399680448,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.03019482399680448\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n\
\ \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n\
\ \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5553072625698324,\n\
\ \"acc_stderr\": 0.016619881988177015,\n \"acc_norm\": 0.5553072625698324,\n\
\ \"acc_norm_stderr\": 0.016619881988177015\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.02103851777015737,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.02103851777015737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5482398956975228,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.5482398956975228,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041496,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041496\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7598039215686274,\n \"acc_stderr\": 0.017282760695167404,\n \
\ \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.017282760695167404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.04069306319721374,\n \"acc_norm\": 0.7636363636363637,\n\
\ \"acc_norm_stderr\": 0.04069306319721374\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.025801283475090496,\n\
\ \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.025801283475090496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160875,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160875\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.423500611995104,\n\
\ \"mc1_stderr\": 0.017297421448534734,\n \"mc2\": 0.588506570085178,\n\
\ \"mc2_stderr\": 0.015040018908256268\n }\n}\n```"
repo_url: https://huggingface.co/psmathur/model_101
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:38:15.380196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:38:15.380196.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:38:15.380196.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:38:15.380196.parquet'
- config_name: results
data_files:
- split: 2023_08_18T01_38_15.380196
path:
- results_2023-08-18T01:38:15.380196.parquet
- split: latest
path:
- results_2023-08-18T01:38:15.380196.parquet
---
# Dataset Card for Evaluation run of psmathur/model_101
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_101
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_101](https://huggingface.co/psmathur/model_101) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_101",
"harness_truthfulqa_mc_0",
split="train")
```
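Since every configuration of this dataset also declares a `latest` split (see the YAML configs above), you can point at the most recent run directly, and the aggregated metrics are exposed through the `results` configuration. A minimal sketch, assuming the same `datasets` API as the snippet above:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_psmathur__model_101"

# Per-example details for the most recent TruthfulQA run of this model.
details = load_dataset(REPO, "harness_truthfulqa_mc_0", split="latest")

# Aggregated metrics for the run, exposed through the "results" configuration.
aggregated = load_dataset(REPO, "results", split="latest")

print(details)
print(aggregated)
```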
## Latest results
These are the [latest results from run 2023-08-18T01:38:15.380196](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_101/blob/main/results_2023-08-18T01%3A38%3A15.380196.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6979243699793755,
"acc_stderr": 0.031179484382710294,
"acc_norm": 0.7017586189998054,
"acc_norm_stderr": 0.031151369790785297,
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534734,
"mc2": 0.588506570085178,
"mc2_stderr": 0.015040018908256268
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623496
},
"harness|hellaswag|10": {
"acc": 0.67805218084047,
"acc_stderr": 0.0046626822330937834,
"acc_norm": 0.8641704839673372,
"acc_norm_stderr": 0.003419072480735362
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741706,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741706
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.033096151770590075,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.033096151770590075
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.03068302084323101,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.03068302084323101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321593,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0301176889295036,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0301176889295036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822523,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8862385321100917,
"acc_stderr": 0.013613614800232796,
"acc_norm": 0.8862385321100917,
"acc_norm_stderr": 0.013613614800232796
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.03019482399680448,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.03019482399680448
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5553072625698324,
"acc_stderr": 0.016619881988177015,
"acc_norm": 0.5553072625698324,
"acc_norm_stderr": 0.016619881988177015
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.02103851777015737,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.02103851777015737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5482398956975228,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.5482398956975228,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041496,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041496
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.017282760695167404,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.017282760695167404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721374,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721374
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.025801283475090496,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.025801283475090496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160875,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160875
},
"harness|truthfulqa:mc|0": {
"mc1": 0.423500611995104,
"mc1_stderr": 0.017297421448534734,
"mc2": 0.588506570085178,
"mc2_stderr": 0.015040018908256268
}
}
```
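To work with these aggregated numbers programmatically rather than reading the JSON by eye, one option is to download the results file linked above and rank the per-task scores. A minimal sketch, assuming `huggingface_hub` is installed and that the downloaded JSON is either the dict shown above or nests it under a top-level `"results"` key:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_psmathur__model_101",
    filename="results_2023-08-18T01:38:15.380196.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Handle either a flat dict (as printed above) or one nested under "results".
results = data.get("results", data)

# Rank the MMLU (hendrycksTest) subtasks by normalized accuracy.
mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
print("strongest subtasks:", ranked[:5])
print("weakest subtasks:", ranked[-5:])
```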
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_psmathur__orca_mini_v3_70b | 2023-08-27T12:40:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 1 | 0 | ---
pretty_name: Evaluation run of psmathur/orca_mini_v3_70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/orca_mini_v3_70b](https://huggingface.co/psmathur/orca_mini_v3_70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v3_70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T01:37:34.029105](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_70b/blob/main/results_2023-08-18T01%3A37%3A34.029105.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7010508529623596,\n\
\ \"acc_stderr\": 0.0309286120388273,\n \"acc_norm\": 0.7049679984523141,\n\
\ \"acc_norm_stderr\": 0.030896356315399304,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n\
\ \"acc_stderr\": 0.00459390260197934,\n \"acc_norm\": 0.8785102569209321,\n\
\ \"acc_norm_stderr\": 0.0032602788112468337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708052,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8161290322580645,\n \"acc_stderr\": 0.02203721734026783,\n \"\
acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.02203721734026783\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"\
acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519513,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305733,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305733\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.0218552552634218,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.0218552552634218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n\
\ \"acc_stderr\": 0.01661139368726857,\n \"acc_norm\": 0.5575418994413408,\n\
\ \"acc_norm_stderr\": 0.01661139368726857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.021330868762127062,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.021330868762127062\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5851063829787234,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559973924380704,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.559973924380704,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427653,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427653\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n }\n}\n```"
repo_url: https://huggingface.co/psmathur/orca_mini_v3_70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:37:34.029105.parquet'
- config_name: results
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- results_2023-08-18T01:37:34.029105.parquet
- split: latest
path:
- results_2023-08-18T01:37:34.029105.parquet
---
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v3_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_70b](https://huggingface.co/psmathur/orca_mini_v3_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_70b",
"harness_truthfulqa_mc_0",
split="train")
```
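The per-task configurations listed in this card's metadata can be loaded the same way. As a minimal sketch (assuming the `datasets` library is installed), the following loads the five-shot abstract algebra details and selects the `latest` split instead of a timestamped one:
```python
from datasets import load_dataset

# Per-task details follow the "harness_hendrycksTest_<subject>_5" config naming
# used in this card; each config exposes a timestamped split and a "latest" alias.
details = load_dataset(
    "open-llm-leaderboard/details_psmathur__orca_mini_v3_70b",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",  # or the timestamped split "2023_08_18T01_37_34.029105"
)

print(details)  # inspect the number of rows and the available columns
```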
## Latest results
These are the [latest results from run 2023-08-18T01:37:34.029105](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_70b/blob/main/results_2023-08-18T01%3A37%3A34.029105.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7010508529623596,
"acc_stderr": 0.0309286120388273,
"acc_norm": 0.7049679984523141,
"acc_norm_stderr": 0.030896356315399304,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.013796182947785562,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.6951802429794861,
"acc_stderr": 0.00459390260197934,
"acc_norm": 0.8785102569209321,
"acc_norm_stderr": 0.0032602788112468337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708052,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026783,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026783
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519513,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305733,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305733
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5575418994413408,
"acc_stderr": 0.01661139368726857,
"acc_norm": 0.5575418994413408,
"acc_norm_stderr": 0.01661139368726857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.021330868762127062,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.021330868762127062
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.559973924380704,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.559973924380704,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427653,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427653
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
}
}
```
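The aggregated metrics shown above are also stored in the `results` configuration, so they can be reloaded programmatically rather than copied from this card. A minimal sketch (the exact column layout of the results file is not documented here, so inspect it after loading):
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics, with one split per run
# plus a "latest" alias pointing to the most recent results.
results = load_dataset(
    "open-llm-leaderboard/details_psmathur__orca_mini_v3_70b",
    "results",
    split="latest",
)

# Check the schema before relying on any particular field.
print(results.column_names)
print(results[0])
```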
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca | 2023-09-23T17:57:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/gpt-neo-1.3B-4bit-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/gpt-neo-1.3B-4bit-alpaca](https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T17:57:15.784929](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca/blob/main/results_2023-09-23T17-57-15.784929.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.00031446531194133983,\n \"f1\": 0.05118708053691287,\n\
\ \"f1_stderr\": 0.001257884278930374,\n \"acc\": 0.2821159149890526,\n\
\ \"acc_stderr\": 0.007628169555669113\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194133983,\n\
\ \"f1\": 0.05118708053691287,\n \"f1_stderr\": 0.001257884278930374\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674233\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5619573796369376,\n \"acc_stderr\": 0.013944181296470803\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|arc:challenge|25_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T17_57_15.784929
path:
- '**/details_harness|drop|3_2023-09-23T17-57-15.784929.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T17-57-15.784929.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T17_57_15.784929
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-57-15.784929.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T17-57-15.784929.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hellaswag|10_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T13:07:16.687815.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T13:07:16.687815.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T17_57_15.784929
path:
- '**/details_harness|winogrande|5_2023-09-23T17-57-15.784929.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T17-57-15.784929.parquet'
- config_name: results
data_files:
- split: 2023_08_18T13_07_16.687815
path:
- results_2023-08-18T13:07:16.687815.parquet
- split: 2023_09_23T17_57_15.784929
path:
- results_2023-09-23T17-57-15.784929.parquet
- split: latest
path:
- results_2023-09-23T17-57-15.784929.parquet
---
# Dataset Card for Evaluation run of TFLai/gpt-neo-1.3B-4bit-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/gpt-neo-1.3B-4bit-alpaca](https://huggingface.co/TFLai/gpt-neo-1.3B-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca",
"harness_winogrande_5",
split="train")
```
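If you want the most recent run without hard-coding a timestamp, a minimal sketch (using the same `load_dataset` call as above) is to request the `latest` split of a configuration:
```python
from datasets import load_dataset

# Each run is stored under a split named after its timestamp; the "latest"
# split always points to the most recent run of that configuration.
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca",
	"harness_winogrande_5",
	split="latest")
```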
## Latest results
These are the [latest results from run 2023-09-23T17:57:15.784929](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca/blob/main/results_2023-09-23T17-57-15.784929.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194133983,
"f1": 0.05118708053691287,
"f1_stderr": 0.001257884278930374,
"acc": 0.2821159149890526,
"acc_stderr": 0.007628169555669113
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194133983,
"f1": 0.05118708053691287,
"f1_stderr": 0.001257884278930374
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674233
},
"harness|winogrande|5": {
"acc": 0.5619573796369376,
"acc_stderr": 0.013944181296470803
}
}
```
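The aggregated numbers above are also stored in the "results" configuration, so they can be read programmatically. A minimal sketch follows; the exact column layout of the results parquet is not documented here, so inspect the features before relying on specific keys:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of each run;
# its "latest" split corresponds to the JSON shown above.
results = load_dataset("open-llm-leaderboard/details_TFLai__gpt-neo-1.3B-4bit-alpaca",
	"results",
	split="latest")
print(results)           # row count and column names
print(results.features)  # check the schema before accessing specific fields
```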
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1 | 2023-08-27T12:40:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of YeungNLP/firefly-bloom-7b1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-bloom-7b1](https://huggingface.co/YeungNLP/firefly-bloom-7b1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:41:37.942439](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1/blob/main/results_2023-08-17T18%3A41%3A37.942439.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.273303830630283,\n\
\ \"acc_stderr\": 0.0322561180503685,\n \"acc_norm\": 0.2764238986705393,\n\
\ \"acc_norm_stderr\": 0.032258648048002526,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4083306889381139,\n\
\ \"mc2_stderr\": 0.014458999585465102\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3660409556313993,\n \"acc_stderr\": 0.014077223108470142,\n\
\ \"acc_norm\": 0.4044368600682594,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46634136626170086,\n\
\ \"acc_stderr\": 0.00497846269096692,\n \"acc_norm\": 0.6120294761999602,\n\
\ \"acc_norm_stderr\": 0.004862919176408075\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3212121212121212,\n \"acc_stderr\": 0.036462049632538136,\n\
\ \"acc_norm\": 0.3212121212121212,\n \"acc_norm_stderr\": 0.036462049632538136\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.03095405547036592,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.03095405547036592\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560486,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.30458715596330277,\n \"acc_stderr\": 0.019732299420354038,\n \"\
acc_norm\": 0.30458715596330277,\n \"acc_norm_stderr\": 0.019732299420354038\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.032365852526021574,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.032365852526021574\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n\
\ \"acc_stderr\": 0.016203792703197804,\n \"acc_norm\": 0.2886334610472541,\n\
\ \"acc_norm_stderr\": 0.016203792703197804\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826507,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826507\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25358539765319427,\n\
\ \"acc_stderr\": 0.01111171533610114,\n \"acc_norm\": 0.25358539765319427,\n\
\ \"acc_norm_stderr\": 0.01111171533610114\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.02714627193662517,\n\
\ \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.02714627193662517\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322267,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.031414708025865885,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.031414708025865885\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.02970528405677243,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.02970528405677243\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4083306889381139,\n\
\ \"mc2_stderr\": 0.014458999585465102\n }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-bloom-7b1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:41:37.942439.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- results_2023-08-17T18:41:37.942439.parquet
- split: latest
path:
- results_2023-08-17T18:41:37.942439.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-bloom-7b1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-bloom-7b1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-bloom-7b1](https://huggingface.co/YeungNLP/firefly-bloom-7b1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1",
"harness_truthfulqa_mc_0",
split="train")
```
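The aggregated metrics for the run are also exposed through the "results" configuration listed in the metadata above; as a minimal sketch (assuming the `datasets` library is installed), you can load its "latest" split directly:
```python
from datasets import load_dataset

# The "results" configuration aggregates the per-task metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1",
    "results",
    split="latest",
)
print(results[0])  # one row containing the aggregated scores
```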
## Latest results
These are the [latest results from run 2023-08-17T18:41:37.942439](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1/blob/main/results_2023-08-17T18%3A41%3A37.942439.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.273303830630283,
"acc_stderr": 0.0322561180503685,
"acc_norm": 0.2764238986705393,
"acc_norm_stderr": 0.032258648048002526,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4083306889381139,
"mc2_stderr": 0.014458999585465102
},
"harness|arc:challenge|25": {
"acc": 0.3660409556313993,
"acc_stderr": 0.014077223108470142,
"acc_norm": 0.4044368600682594,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.46634136626170086,
"acc_stderr": 0.00497846269096692,
"acc_norm": 0.6120294761999602,
"acc_norm_stderr": 0.004862919176408075
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3212121212121212,
"acc_stderr": 0.036462049632538136,
"acc_norm": 0.3212121212121212,
"acc_norm_stderr": 0.036462049632538136
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.03095405547036592,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.03095405547036592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560486,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30458715596330277,
"acc_stderr": 0.019732299420354038,
"acc_norm": 0.30458715596330277,
"acc_norm_stderr": 0.019732299420354038
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.016203792703197804,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.016203792703197804
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.025553169991826507,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.025553169991826507
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.01111171533610114,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.01111171533610114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.02714627193662517,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.02714627193662517
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322267,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.02970528405677243,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.02970528405677243
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4083306889381139,
"mc2_stderr": 0.014458999585465102
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b | 2023-09-17T11:42:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/samantha-1.1-llama-33b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-1.1-llama-33b](https://huggingface.co/ehartford/samantha-1.1-llama-33b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T11:42:44.859774](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b/blob/main/results_2023-09-17T11-42-44.859774.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20994127516778524,\n\
\ \"em_stderr\": 0.004170789326061049,\n \"f1\": 0.2829341442953027,\n\
\ \"f1_stderr\": 0.004181823285876536,\n \"acc\": 0.4024903466008606,\n\
\ \"acc_stderr\": 0.008664723950310687\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.20994127516778524,\n \"em_stderr\": 0.004170789326061049,\n\
\ \"f1\": 0.2829341442953027,\n \"f1_stderr\": 0.004181823285876536\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \
\ \"acc_stderr\": 0.00540943973697051\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650865\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-1.1-llama-33b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T11_42_44.859774
path:
- '**/details_harness|drop|3_2023-09-17T11-42-44.859774.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T11-42-44.859774.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T11_42_44.859774
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-44.859774.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-42-44.859774.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:31:51.159426.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:31:51.159426.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T11_42_44.859774
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-44.859774.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T11-42-44.859774.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_31_51.159426
path:
- results_2023-08-18T14:31:51.159426.parquet
- split: 2023_09_17T11_42_44.859774
path:
- results_2023-09-17T11-42-44.859774.parquet
- split: latest
path:
- results_2023-09-17T11-42-44.859774.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-1.1-llama-33b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-1.1-llama-33b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-1.1-llama-33b](https://huggingface.co/ehartford/samantha-1.1-llama-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b",
"harness_winogrande_5",
split="train")
```
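Because this dataset was created from more than one run, each run can also be loaded through the timestamped split name listed in the configuration metadata above; a minimal sketch, assuming the `datasets` library is installed:
```python
from datasets import load_dataset

# Each run is stored under a split named after its timestamp;
# here the 2023-09-17 run is loaded explicitly instead of the "latest" alias.
data = load_dataset(
    "open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b",
    "harness_winogrande_5",
    split="2023_09_17T11_42_44.859774",
)
```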
## Latest results
These are the [latest results from run 2023-09-17T11:42:44.859774](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-1.1-llama-33b/blob/main/results_2023-09-17T11-42-44.859774.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.20994127516778524,
"em_stderr": 0.004170789326061049,
"f1": 0.2829341442953027,
"f1_stderr": 0.004181823285876536,
"acc": 0.4024903466008606,
"acc_stderr": 0.008664723950310687
},
"harness|drop|3": {
"em": 0.20994127516778524,
"em_stderr": 0.004170789326061049,
"f1": 0.2829341442953027,
"f1_stderr": 0.004181823285876536
},
"harness|gsm8k|5": {
"acc": 0.0401819560272934,
"acc_stderr": 0.00540943973697051
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650865
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora | 2023-08-27T12:40:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/minotaur-llama2-13b-qlora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/minotaur-llama2-13b-qlora](https://huggingface.co/ehartford/minotaur-llama2-13b-qlora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T01:34:00.982275](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora/blob/main/results_2023-08-18T01%3A34%3A00.982275.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5598972092657262,\n\
\ \"acc_stderr\": 0.03439801562067804,\n \"acc_norm\": 0.5638773775015778,\n\
\ \"acc_norm_stderr\": 0.034377595991925414,\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920616,\n \"mc2\": 0.45574844972222117,\n\
\ \"mc2_stderr\": 0.015126602239822708\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182526,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946704\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6226847241585342,\n\
\ \"acc_stderr\": 0.00483724201519112,\n \"acc_norm\": 0.8242381995618403,\n\
\ \"acc_norm_stderr\": 0.003798395055021535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.031544498882702846,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.031544498882702846\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736236,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736236\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871913,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871913\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261836,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7573435504469987,\n\
\ \"acc_stderr\": 0.01532988894089986,\n \"acc_norm\": 0.7573435504469987,\n\
\ \"acc_norm_stderr\": 0.01532988894089986\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179602,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179602\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.0276841818833029,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.0276841818833029\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.026959344518747784,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.026959344518747784\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284062,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4015645371577575,\n\
\ \"acc_stderr\": 0.012520315120147119,\n \"acc_norm\": 0.4015645371577575,\n\
\ \"acc_norm_stderr\": 0.012520315120147119\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5514705882352942,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.5514705882352942,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5310457516339869,\n \"acc_stderr\": 0.02018880445636189,\n \
\ \"acc_norm\": 0.5310457516339869,\n \"acc_norm_stderr\": 0.02018880445636189\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920616,\n \"mc2\": 0.45574844972222117,\n\
\ \"mc2_stderr\": 0.015126602239822708\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/minotaur-llama2-13b-qlora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:34:00.982275.parquet'
- config_name: results
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- results_2023-08-18T01:34:00.982275.parquet
- split: latest
path:
- results_2023-08-18T01:34:00.982275.parquet
---
# Dataset Card for Evaluation run of ehartford/minotaur-llama2-13b-qlora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/minotaur-llama2-13b-qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/minotaur-llama2-13b-qlora](https://huggingface.co/ehartford/minotaur-llama2-13b-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora",
"harness_truthfulqa_mc_0",
split="train")
```
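The aggregated metrics reported below live in the "results" configuration; a minimal sketch for loading them at the "latest" split (the configuration and split names are taken from the `configs` listing of this card):
```python
from datasets import load_dataset
# Aggregated metrics for this run ("results" configuration, "latest" split)
results = load_dataset("open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora",
	"results",
	split="latest")
```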
## Latest results
These are the [latest results from run 2023-08-18T01:34:00.982275](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora/blob/main/results_2023-08-18T01%3A34%3A00.982275.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5598972092657262,
"acc_stderr": 0.03439801562067804,
"acc_norm": 0.5638773775015778,
"acc_norm_stderr": 0.034377595991925414,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920616,
"mc2": 0.45574844972222117,
"mc2_stderr": 0.015126602239822708
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182526,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946704
},
"harness|hellaswag|10": {
"acc": 0.6226847241585342,
"acc_stderr": 0.00483724201519112,
"acc_norm": 0.8242381995618403,
"acc_norm_stderr": 0.003798395055021535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.031544498882702846,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.031544498882702846
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624528,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736236,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871913,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871913
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080445,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261836,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7573435504469987,
"acc_stderr": 0.01532988894089986,
"acc_norm": 0.7573435504469987,
"acc_norm_stderr": 0.01532988894089986
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179602,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179602
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.0276841818833029,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.0276841818833029
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.026959344518747784,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.026959344518747784
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284062,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4015645371577575,
"acc_stderr": 0.012520315120147119,
"acc_norm": 0.4015645371577575,
"acc_norm_stderr": 0.012520315120147119
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5514705882352942,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.5514705882352942,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5310457516339869,
"acc_stderr": 0.02018880445636189,
"acc_norm": 0.5310457516339869,
"acc_norm_stderr": 0.02018880445636189
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920616,
"mc2": 0.45574844972222117,
"mc2_stderr": 0.015126602239822708
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA | 2023-08-27T12:40:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of v2ray/LLaMA-2-Wizard-70B-QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [v2ray/LLaMA-2-Wizard-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T07:09:43.451689](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA/blob/main/results_2023-08-18T07%3A09%3A43.451689.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6888919374287494,\n\
\ \"acc_stderr\": 0.03126423518242464,\n \"acc_norm\": 0.6925236847264156,\n\
\ \"acc_norm_stderr\": 0.031236555638117565,\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.01737969755543745,\n \"mc2\": 0.6172087939156057,\n\
\ \"mc2_stderr\": 0.015172661646651054\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.01399805690262019,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.01365998089427737\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6948814977096196,\n\
\ \"acc_stderr\": 0.004595165551383618,\n \"acc_norm\": 0.8750248954391555,\n\
\ \"acc_norm_stderr\": 0.0033001484456091326\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948607,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948607\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.02203721734026782,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.02203721734026782\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.023119362758232304,\n\
\ \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.023119362758232304\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606646,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606646\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.02772206549336126,\n \
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.02772206549336126\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8917431192660551,\n\
\ \"acc_stderr\": 0.013321348447611759,\n \"acc_norm\": 0.8917431192660551,\n\
\ \"acc_norm_stderr\": 0.013321348447611759\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653063,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237102,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237102\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8531289910600255,\n\
\ \"acc_stderr\": 0.012658201736147292,\n \"acc_norm\": 0.8531289910600255,\n\
\ \"acc_norm_stderr\": 0.012658201736147292\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4972067039106145,\n\
\ \"acc_stderr\": 0.01672224059549172,\n \"acc_norm\": 0.4972067039106145,\n\
\ \"acc_norm_stderr\": 0.01672224059549172\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7652733118971061,\n\
\ \"acc_stderr\": 0.02407180588767704,\n \"acc_norm\": 0.7652733118971061,\n\
\ \"acc_norm_stderr\": 0.02407180588767704\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396154,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5547588005215124,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.5547588005215124,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103135,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103135\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7516339869281046,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.025172984350155754,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.025172984350155754\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.01737969755543745,\n \"mc2\": 0.6172087939156057,\n\
\ \"mc2_stderr\": 0.015172661646651054\n }\n}\n```"
repo_url: https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:09:43.451689.parquet'
- config_name: results
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- results_2023-08-18T07:09:43.451689.parquet
- split: latest
path:
- results_2023-08-18T07:09:43.451689.parquet
---
# Dataset Card for Evaluation run of v2ray/LLaMA-2-Wizard-70B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [v2ray/LLaMA-2-Wizard-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA",
"harness_truthfulqa_mc_0",
split="train")
```
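The same call works for any of the configurations listed in this card. As a small additional sketch (using only the configuration names and the "latest" split alias declared above), you can read the aggregated scores through the "results" configuration instead of the per-task details:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA"

# Aggregated metrics of the run; "latest" always resolves to the newest timestamped split.
results = load_dataset(repo, "results", split="latest")

# Per-sample details for a single task, using the same "latest" alias.
hellaswag = load_dataset(repo, "harness_hellaswag_10", split="latest")

print(results[0].keys())
print(hellaswag[0].keys())
```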
## Latest results
These are the [latest results from run 2023-08-18T07:09:43.451689](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA/blob/main/results_2023-08-18T07%3A09%3A43.451689.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6888919374287494,
"acc_stderr": 0.03126423518242464,
"acc_norm": 0.6925236847264156,
"acc_norm_stderr": 0.031236555638117565,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.01737969755543745,
"mc2": 0.6172087939156057,
"mc2_stderr": 0.015172661646651054
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.01399805690262019,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.01365998089427737
},
"harness|hellaswag|10": {
"acc": 0.6948814977096196,
"acc_stderr": 0.004595165551383618,
"acc_norm": 0.8750248954391555,
"acc_norm_stderr": 0.0033001484456091326
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948607,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948607
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026782,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026782
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.023119362758232304,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.023119362758232304
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606646,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606646
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.02772206549336126,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.02772206549336126
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611759,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611759
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237102,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237102
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8531289910600255,
"acc_stderr": 0.012658201736147292,
"acc_norm": 0.8531289910600255,
"acc_norm_stderr": 0.012658201736147292
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617893,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617893
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4972067039106145,
"acc_stderr": 0.01672224059549172,
"acc_norm": 0.4972067039106145,
"acc_norm_stderr": 0.01672224059549172
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7652733118971061,
"acc_stderr": 0.02407180588767704,
"acc_norm": 0.7652733118971061,
"acc_norm_stderr": 0.02407180588767704
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396154,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5547588005215124,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.5547588005215124,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103135,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103135
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.025172984350155754,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.025172984350155754
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.01737969755543745,
"mc2": 0.6172087939156057,
"mc2_stderr": 0.015172661646651054
}
}
```
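The dictionary above is also stored as a plain JSON file in this repository (the file linked at the start of this section). As a hedged sketch, assuming that file keeps the layout shown here (some harness versions nest the per-task scores under a top-level "results" key), you could re-derive the MMLU average from the hendrycksTest accuracies:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for this run (the same file linked above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA",
    filename="results_2023-08-18T07:09:43.451689.json",
    repo_type="dataset",
)
with open(path) as f:
    run = json.load(f)

# Use the nested "results" section if present, otherwise the top-level dict shown above.
scores = run.get("results", run)
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```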
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_digitous__13B-Chimera | 2023-08-27T12:40:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of digitous/13B-Chimera
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [digitous/13B-Chimera](https://huggingface.co/digitous/13B-Chimera) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_digitous__13B-Chimera\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T15:36:44.224352](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__13B-Chimera/blob/main/results_2023-08-17T15%3A36%3A44.224352.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.501681989763272,\n\
\ \"acc_stderr\": 0.03489701828715803,\n \"acc_norm\": 0.5052950528804787,\n\
\ \"acc_norm_stderr\": 0.03487945109557973,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677147,\n \"mc2\": 0.5259120317801959,\n\
\ \"mc2_stderr\": 0.015140404580264173\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n\
\ \"acc_norm\": 0.575938566552901,\n \"acc_norm_stderr\": 0.014441889627464398\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6163114917347142,\n\
\ \"acc_stderr\": 0.004852896681736759,\n \"acc_norm\": 0.8149770961959769,\n\
\ \"acc_norm_stderr\": 0.0038752253693657315\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.040925639582376556,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.040925639582376556\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5548387096774193,\n\
\ \"acc_stderr\": 0.028272410186214906,\n \"acc_norm\": 0.5548387096774193,\n\
\ \"acc_norm_stderr\": 0.028272410186214906\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.0346488167501634,\n \"acc_norm\"\
: 0.6161616161616161,\n \"acc_norm_stderr\": 0.0346488167501634\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.03340361906276587,\n \
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.03340361906276587\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097856,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097856\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.689908256880734,\n \"acc_stderr\": 0.019830849684439752,\n \"\
acc_norm\": 0.689908256880734,\n \"acc_norm_stderr\": 0.019830849684439752\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3101851851851852,\n \"acc_stderr\": 0.0315469628565663,\n \"acc_norm\"\
: 0.3101851851851852,\n \"acc_norm_stderr\": 0.0315469628565663\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6715686274509803,\n\
\ \"acc_stderr\": 0.03296245110172229,\n \"acc_norm\": 0.6715686274509803,\n\
\ \"acc_norm_stderr\": 0.03296245110172229\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138608,\n\
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138608\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.03922378290610991,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.03922378290610991\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7062579821200511,\n\
\ \"acc_stderr\": 0.016287759388491654,\n \"acc_norm\": 0.7062579821200511,\n\
\ \"acc_norm_stderr\": 0.016287759388491654\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.026864624366756646,\n\
\ \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.026864624366756646\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574917,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626595,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626595\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.028345045864840625,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.028345045864840625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.02771666165019404,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.02771666165019404\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4015645371577575,\n\
\ \"acc_stderr\": 0.012520315120147106,\n \"acc_norm\": 0.4015645371577575,\n\
\ \"acc_norm_stderr\": 0.012520315120147106\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.03030625772246832,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.03030625772246832\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5228758169934641,\n \"acc_stderr\": 0.020206653187884786,\n \
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.020206653187884786\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.032941184790540944,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.032941184790540944\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677147,\n \"mc2\": 0.5259120317801959,\n\
\ \"mc2_stderr\": 0.015140404580264173\n }\n}\n```"
repo_url: https://huggingface.co/digitous/13B-Chimera
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:36:44.224352.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:36:44.224352.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:36:44.224352.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:36:44.224352.parquet'
- config_name: results
data_files:
- split: 2023_08_17T15_36_44.224352
path:
- results_2023-08-17T15:36:44.224352.parquet
- split: latest
path:
- results_2023-08-17T15:36:44.224352.parquet
---
# Dataset Card for Evaluation run of digitous/13B-Chimera
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/digitous/13B-Chimera
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [digitous/13B-Chimera](https://huggingface.co/digitous/13B-Chimera) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_digitous__13B-Chimera",
"harness_truthfulqa_mc_0",
split="train")
```
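The same pattern works for any of the configurations listed in the YAML header above. As a minimal sketch (assuming only the config names shown there, e.g. `results` and `harness_arc_challenge_25`, each with a `latest` split), the aggregated metrics and a single task's details can be loaded like this:
```python
from datasets import load_dataset

# Aggregated metrics of the run (the "results" configuration, most recent split)
results = load_dataset("open-llm-leaderboard/details_digitous__13B-Chimera",
                       "results",
                       split="latest")

# Per-sample details for one task, e.g. ARC-Challenge (25-shot)
arc_details = load_dataset("open-llm-leaderboard/details_digitous__13B-Chimera",
                           "harness_arc_challenge_25",
                           split="latest")
```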
## Latest results
These are the [latest results from run 2023-08-17T15:36:44.224352](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__13B-Chimera/blob/main/results_2023-08-17T15%3A36%3A44.224352.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.501681989763272,
"acc_stderr": 0.03489701828715803,
"acc_norm": 0.5052950528804787,
"acc_norm_stderr": 0.03487945109557973,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677147,
"mc2": 0.5259120317801959,
"mc2_stderr": 0.015140404580264173
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.575938566552901,
"acc_norm_stderr": 0.014441889627464398
},
"harness|hellaswag|10": {
"acc": 0.6163114917347142,
"acc_stderr": 0.004852896681736759,
"acc_norm": 0.8149770961959769,
"acc_norm_stderr": 0.0038752253693657315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.040925639582376556,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.040925639582376556
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5548387096774193,
"acc_stderr": 0.028272410186214906,
"acc_norm": 0.5548387096774193,
"acc_norm_stderr": 0.028272410186214906
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.0346488167501634,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.0346488167501634
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.03340361906276587,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.03340361906276587
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097856,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097856
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.689908256880734,
"acc_stderr": 0.019830849684439752,
"acc_norm": 0.689908256880734,
"acc_norm_stderr": 0.019830849684439752
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3101851851851852,
"acc_stderr": 0.0315469628565663,
"acc_norm": 0.3101851851851852,
"acc_norm_stderr": 0.0315469628565663
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.03296245110172229,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.03296245110172229
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138608,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138608
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.03922378290610991,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.03922378290610991
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7062579821200511,
"acc_stderr": 0.016287759388491654,
"acc_norm": 0.7062579821200511,
"acc_norm_stderr": 0.016287759388491654
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.026864624366756646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.026864624366756646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574917,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.028509807802626595,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.028509807802626595
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.028345045864840625,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.028345045864840625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.02771666165019404,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.02771666165019404
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4015645371577575,
"acc_stderr": 0.012520315120147106,
"acc_norm": 0.4015645371577575,
"acc_norm_stderr": 0.012520315120147106
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.03030625772246832,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.03030625772246832
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.020206653187884786,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.020206653187884786
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.032941184790540944,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.032941184790540944
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677147,
"mc2": 0.5259120317801959,
"mc2_stderr": 0.015140404580264173
}
}
```
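The task keys above follow the pattern `harness|<task>|<num_fewshot>`, with an extra `all` entry holding the averaged metrics; most tasks report `acc`/`acc_norm`, while TruthfulQA reports `mc1`/`mc2`. Below is a minimal sketch of iterating over such a dictionary (a small excerpt of the values above is inlined so the snippet runs on its own):
```python
# Excerpt of the results dictionary printed above (illustrative subset only)
latest_results = {
    "all": {"acc": 0.501681989763272, "acc_norm": 0.5052950528804787},
    "harness|arc:challenge|25": {"acc": 0.5614334470989761, "acc_norm": 0.575938566552901},
    "harness|hellaswag|10": {"acc": 0.6163114917347142, "acc_norm": 0.8149770961959769},
    "harness|truthfulqa:mc|0": {"mc1": 0.3598531211750306, "mc2": 0.5259120317801959},
}

for key, metrics in latest_results.items():
    if key == "all":
        continue  # "all" holds the averaged metrics, not a single task
    _, task, num_fewshot = key.split("|")
    score = metrics.get("acc_norm", metrics.get("mc2"))
    print(f"{task} ({num_fewshot}-shot): {score}")
```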
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-22b-Prototype | 2023-08-27T12:40:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of The-Face-Of-Goonery/Huginn-22b-Prototype
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [The-Face-Of-Goonery/Huginn-22b-Prototype](https://huggingface.co/The-Face-Of-Goonery/Huginn-22b-Prototype)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-22b-Prototype\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T17:52:21.766212](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-22b-Prototype/blob/main/results_2023-08-17T17%3A52%3A21.766212.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5005161617279109,\n\
\ \"acc_stderr\": 0.03516843144806568,\n \"acc_norm\": 0.5046999837864469,\n\
\ \"acc_norm_stderr\": 0.035150093964360585,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5211078120321375,\n\
\ \"mc2_stderr\": 0.015702689606263226\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294317,\n\
\ \"acc_norm\": 0.5767918088737202,\n \"acc_norm_stderr\": 0.014438036220848029\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6078470424218283,\n\
\ \"acc_stderr\": 0.004872326888655524,\n \"acc_norm\": 0.8069109739095798,\n\
\ \"acc_norm_stderr\": 0.003939155484500651\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.03063562795796182,\n\
\ \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.03063562795796182\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.043255060420170854,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.043255060420170854\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n\
\ \"acc_stderr\": 0.02779187875313227,\n \"acc_norm\": 0.6064516129032258,\n\
\ \"acc_norm_stderr\": 0.02779187875313227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036545,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.031821550509166456,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.031821550509166456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n\
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6642201834862386,\n \"acc_stderr\": 0.020248081396752927,\n \"\
acc_norm\": 0.6642201834862386,\n \"acc_norm_stderr\": 0.020248081396752927\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591518,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591518\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6497890295358649,\n \"acc_stderr\": 0.031052391937584346,\n \
\ \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.031052391937584346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012349,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012349\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674054,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674054\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.669220945083014,\n\
\ \"acc_stderr\": 0.016824818462563746,\n \"acc_norm\": 0.669220945083014,\n\
\ \"acc_norm_stderr\": 0.016824818462563746\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.026720034380514995,\n\
\ \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.026720034380514995\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n\
\ \"acc_stderr\": 0.01405431493561455,\n \"acc_norm\": 0.22905027932960895,\n\
\ \"acc_norm_stderr\": 0.01405431493561455\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.028358956313423545,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.028358956313423545\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.02878222756134724,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.02878222756134724\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35919165580182527,\n\
\ \"acc_stderr\": 0.012253386187584252,\n \"acc_norm\": 0.35919165580182527,\n\
\ \"acc_norm_stderr\": 0.012253386187584252\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016636,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016636\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46405228758169936,\n \"acc_stderr\": 0.02017548876548405,\n \
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.02017548876548405\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.032357437893550424,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.032357437893550424\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611548,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611548\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5211078120321375,\n\
\ \"mc2_stderr\": 0.015702689606263226\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-22b-Prototype
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|arc:challenge|25_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hellaswag|10_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:52:21.766212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:52:21.766212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T17:52:21.766212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T17:52:21.766212.parquet'
- config_name: results
data_files:
- split: 2023_08_17T17_52_21.766212
path:
- results_2023-08-17T17:52:21.766212.parquet
- split: latest
path:
- results_2023-08-17T17:52:21.766212.parquet
---
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-22b-Prototype
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/The-Face-Of-Goonery/Huginn-22b-Prototype
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-22b-Prototype](https://huggingface.co/The-Face-Of-Goonery/Huginn-22b-Prototype) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-22b-Prototype",
"harness_truthfulqa_mc_0",
split="train")
```
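The same pattern works for any of the configurations listed in this card. Below is a minimal sketch (the config and split names are taken from the configuration list above; the exact column layout of each split may vary with the harness version) showing how to load a single MMLU subtask or the aggregated "results" table from the "latest" split:
```python
from datasets import load_dataset

# Per-example details for one MMLU subtask, always pointing at the most recent run.
details = load_dataset(
    "open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-22b-Prototype",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Aggregated metrics for the whole run (the numbers shown under "Latest results").
results = load_dataset(
    "open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-22b-Prototype",
    "results",
    split="latest",
)

print(details)     # per-example predictions for the subtask
print(results[0])  # first row of the aggregated results table
```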
## Latest results
These are the [latest results from run 2023-08-17T17:52:21.766212](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-22b-Prototype/blob/main/results_2023-08-17T17%3A52%3A21.766212.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5005161617279109,
"acc_stderr": 0.03516843144806568,
"acc_norm": 0.5046999837864469,
"acc_norm_stderr": 0.035150093964360585,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5211078120321375,
"mc2_stderr": 0.015702689606263226
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294317,
"acc_norm": 0.5767918088737202,
"acc_norm_stderr": 0.014438036220848029
},
"harness|hellaswag|10": {
"acc": 0.6078470424218283,
"acc_stderr": 0.004872326888655524,
"acc_norm": 0.8069109739095798,
"acc_norm_stderr": 0.003939155484500651
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.03063562795796182,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.03063562795796182
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.043255060420170854,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.043255060420170854
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.02779187875313227,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.02779187875313227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036545,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.037818873532059816,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.037818873532059816
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.031821550509166456,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.031821550509166456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6642201834862386,
"acc_stderr": 0.020248081396752927,
"acc_norm": 0.6642201834862386,
"acc_norm_stderr": 0.020248081396752927
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591518,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591518
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.031052391937584346,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.031052391937584346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199986,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199986
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012349,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012349
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674054,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674054
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.669220945083014,
"acc_stderr": 0.016824818462563746,
"acc_norm": 0.669220945083014,
"acc_norm_stderr": 0.016824818462563746
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.026720034380514995,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.026720034380514995
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.01405431493561455,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.01405431493561455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.028358956313423545,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.028358956313423545
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.02878222756134724,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.02878222756134724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35919165580182527,
"acc_stderr": 0.012253386187584252,
"acc_norm": 0.35919165580182527,
"acc_norm_stderr": 0.012253386187584252
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.02017548876548405,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.02017548876548405
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.032357437893550424,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.032357437893550424
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611548,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611548
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5211078120321375,
"mc2_stderr": 0.015702689606263226
}
}
```
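The aggregated numbers above are also stored in the `results` configuration of this repository. The snippet below is a minimal sketch of how they could be pulled programmatically; it assumes the `results` config and its `latest` split follow the layout described earlier in this card, and the exact column schema of the parquet file may differ from the JSON shown above, so inspecting the first row is a reasonable first step.
```python
from datasets import load_dataset

# Aggregated results for the latest evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_migtissera__Synthia-7B",
    "results",
    split="latest",
)

# Inspect the available columns and the first row before relying on a schema.
print(results.column_names)
print(results[0])
```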
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_facebook__opt-1.3b | 2023-08-27T12:40:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of facebook/opt-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [facebook/opt-1.3b](https://huggingface.co/facebook/opt-1.3b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_facebook__opt-1.3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T14:50:30.777525](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-1.3b/blob/main/results_2023-08-18T14%3A50%3A30.777525.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each one in the results and in the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25268905375730805,\n\
\ \"acc_stderr\": 0.031459779978536884,\n \"acc_norm\": 0.2554147061503054,\n\
\ \"acc_norm_stderr\": 0.031467922404608224,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104184,\n \"mc2\": 0.38710675790483107,\n\
\ \"mc2_stderr\": 0.014210843908178184\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.26535836177474403,\n \"acc_stderr\": 0.012902554762313967,\n\
\ \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382318\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41435968930491934,\n\
\ \"acc_stderr\": 0.004916043838455668,\n \"acc_norm\": 0.545309699263095,\n\
\ \"acc_norm_stderr\": 0.004969251445596341\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.02661648298050171,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.02661648298050171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.046170348270067184,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.046170348270067184\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412417,\n\
\ \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412417\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n\
\ \"acc_stderr\": 0.031298431857438094,\n \"acc_norm\": 0.14285714285714285,\n\
\ \"acc_norm_stderr\": 0.031298431857438094\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.23548387096774193,\n \"acc_stderr\": 0.024137632429337717,\n \"\
acc_norm\": 0.23548387096774193,\n \"acc_norm_stderr\": 0.024137632429337717\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114454,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114454\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565318,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204423,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204423\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267624,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267624\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.026064313406304527,\n\
\ \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.026064313406304527\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25688073394495414,\n \"acc_stderr\": 0.018732492928342444,\n \"\
acc_norm\": 0.25688073394495414,\n \"acc_norm_stderr\": 0.018732492928342444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.03070137211151092,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.03070137211151092\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.18137254901960784,\n \"acc_stderr\": 0.02704462171947407,\n \"\
acc_norm\": 0.18137254901960784,\n \"acc_norm_stderr\": 0.02704462171947407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24050632911392406,\n \"acc_stderr\": 0.02782078198114968,\n \
\ \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.02782078198114968\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.02991858670779882,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.02991858670779882\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.02812096650391441,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.02812096650391441\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n\
\ \"acc_stderr\": 0.015745497169049053,\n \"acc_norm\": 0.26309067688378035,\n\
\ \"acc_norm_stderr\": 0.015745497169049053\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19614147909967847,\n\
\ \"acc_stderr\": 0.022552447780478033,\n \"acc_norm\": 0.19614147909967847,\n\
\ \"acc_norm_stderr\": 0.022552447780478033\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266733,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266733\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23142112125162972,\n\
\ \"acc_stderr\": 0.010771461711576462,\n \"acc_norm\": 0.23142112125162972,\n\
\ \"acc_norm_stderr\": 0.010771461711576462\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27124183006535946,\n \"acc_stderr\": 0.01798661530403031,\n \
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.01798661530403031\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.0265370453121453,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.0265370453121453\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104184,\n \"mc2\": 0.38710675790483107,\n\
\ \"mc2_stderr\": 0.014210843908178184\n }\n}\n```"
repo_url: https://huggingface.co/facebook/opt-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:50:30.777525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:50:30.777525.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:50:30.777525.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:50:30.777525.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_50_30.777525
path:
- results_2023-08-18T14:50:30.777525.parquet
- split: latest
path:
- results_2023-08-18T14:50:30.777525.parquet
---
# Dataset Card for Evaluation run of facebook/opt-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/facebook/opt-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [facebook/opt-1.3b](https://huggingface.co/facebook/opt-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_facebook__opt-1.3b",
"harness_truthfulqa_mc_0",
split="train")
```
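The same pattern works for any configuration listed in this card's YAML header. As a hypothetical example, loading the latest run of a single MMLU sub-task (the config name `harness_hendrycksTest_abstract_algebra_5` is taken from the config list above) might look like this:
```python
from datasets import load_dataset

# Per-sample details for one MMLU sub-task; the "latest" split always
# points to the most recent evaluation run of this model.
data = load_dataset(
    "open-llm-leaderboard/details_facebook__opt-1.3b",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(data)
```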
## Latest results
These are the [latest results from run 2023-08-18T14:50:30.777525](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-1.3b/blob/main/results_2023-08-18T14%3A50%3A30.777525.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25268905375730805,
"acc_stderr": 0.031459779978536884,
"acc_norm": 0.2554147061503054,
"acc_norm_stderr": 0.031467922404608224,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104184,
"mc2": 0.38710675790483107,
"mc2_stderr": 0.014210843908178184
},
"harness|arc:challenge|25": {
"acc": 0.26535836177474403,
"acc_stderr": 0.012902554762313967,
"acc_norm": 0.295221843003413,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.41435968930491934,
"acc_stderr": 0.004916043838455668,
"acc_norm": 0.545309699263095,
"acc_norm_stderr": 0.004969251445596341
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.02661648298050171,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.02661648298050171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.046170348270067184,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.046170348270067184
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412417,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.031298431857438094,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.031298431857438094
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.024137632429337717,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.024137632429337717
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114454,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114454
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565318,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204423,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204423
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267624,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267624
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20168067226890757,
"acc_stderr": 0.026064313406304527,
"acc_norm": 0.20168067226890757,
"acc_norm_stderr": 0.026064313406304527
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25688073394495414,
"acc_stderr": 0.018732492928342444,
"acc_norm": 0.25688073394495414,
"acc_norm_stderr": 0.018732492928342444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.03070137211151092,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.03070137211151092
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.18137254901960784,
"acc_stderr": 0.02704462171947407,
"acc_norm": 0.18137254901960784,
"acc_norm_stderr": 0.02704462171947407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.02991858670779882,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.02991858670779882
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02812096650391441,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02812096650391441
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049053,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049053
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19614147909967847,
"acc_stderr": 0.022552447780478033,
"acc_norm": 0.19614147909967847,
"acc_norm_stderr": 0.022552447780478033
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266733,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266733
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23142112125162972,
"acc_stderr": 0.010771461711576462,
"acc_norm": 0.23142112125162972,
"acc_norm_stderr": 0.010771461711576462
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.01798661530403031,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.01798661530403031
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.0265370453121453,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.0265370453121453
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104184,
"mc2": 0.38710675790483107,
"mc2_stderr": 0.014210843908178184
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_facebook__xglm-1.7B | 2023-08-27T12:40:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of facebook/xglm-1.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [facebook/xglm-1.7B](https://huggingface.co/facebook/xglm-1.7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_facebook__xglm-1.7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T06:53:01.114817](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__xglm-1.7B/blob/main/results_2023-08-18T06%3A53%3A01.114817.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25240001646517257,\n\
\ \"acc_stderr\": 0.03129950810334678,\n \"acc_norm\": 0.2546195139198785,\n\
\ \"acc_norm_stderr\": 0.03131375977392343,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487298,\n \"mc2\": 0.37206341516963043,\n\
\ \"mc2_stderr\": 0.014131533976938119\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22098976109215018,\n \"acc_stderr\": 0.012124929206818258,\n\
\ \"acc_norm\": 0.25853242320819114,\n \"acc_norm_stderr\": 0.012794553754288675\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3633738299143597,\n\
\ \"acc_stderr\": 0.004799882248494812,\n \"acc_norm\": 0.4567815176259709,\n\
\ \"acc_norm_stderr\": 0.00497110626504656\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173042,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173042\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n\
\ \"acc_stderr\": 0.02564938106302926,\n \"acc_norm\": 0.2838709677419355,\n\
\ \"acc_norm_stderr\": 0.02564938106302926\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.02066059748502693,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.02066059748502693\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715473,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715473\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780306,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035307,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23766816143497757,\n\
\ \"acc_stderr\": 0.02856807946471425,\n \"acc_norm\": 0.23766816143497757,\n\
\ \"acc_norm_stderr\": 0.02856807946471425\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.16793893129770993,\n \"acc_stderr\": 0.03278548537343138,\n\
\ \"acc_norm\": 0.16793893129770993,\n \"acc_norm_stderr\": 0.03278548537343138\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2669220945083014,\n\
\ \"acc_stderr\": 0.01581845089477755,\n \"acc_norm\": 0.2669220945083014,\n\
\ \"acc_norm_stderr\": 0.01581845089477755\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912248,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912248\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.32098765432098764,\n \"acc_stderr\": 0.02597656601086273,\n\
\ \"acc_norm\": 0.32098765432098764,\n \"acc_norm_stderr\": 0.02597656601086273\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590634,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590634\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.03023375855159646,\n\
\ \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.03023375855159646\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528027,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528027\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.02653704531214531,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.02653704531214531\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.014509045171487298,\n \"mc2\": 0.37206341516963043,\n\
\ \"mc2_stderr\": 0.014131533976938119\n }\n}\n```"
repo_url: https://huggingface.co/facebook/xglm-1.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|arc:challenge|25_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hellaswag|10_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:53:01.114817.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T06:53:01.114817.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T06:53:01.114817.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T06:53:01.114817.parquet'
- config_name: results
data_files:
- split: 2023_08_18T06_53_01.114817
path:
- results_2023-08-18T06:53:01.114817.parquet
- split: latest
path:
- results_2023-08-18T06:53:01.114817.parquet
---
# Dataset Card for Evaluation run of facebook/xglm-1.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/facebook/xglm-1.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [facebook/xglm-1.7B](https://huggingface.co/facebook/xglm-1.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_facebook__xglm-1.7B",
"harness_truthfulqa_mc_0",
split="train")
```
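As a complement, every per-task configuration listed in the YAML header above (as well as the aggregated "results" configuration) can be loaded the same way; below is a minimal sketch using the `latest` split, which points to the most recent evaluation:
```python
from datasets import load_dataset

# Aggregated scores of the run (the "results" configuration defined above).
results = load_dataset("open-llm-leaderboard/details_facebook__xglm-1.7B",
                       "results",
                       split="latest")

# Per-task details, e.g. the 5-shot abstract algebra MMLU subset.
abstract_algebra = load_dataset("open-llm-leaderboard/details_facebook__xglm-1.7B",
                                "harness_hendrycksTest_abstract_algebra_5",
                                split="latest")
```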
## Latest results
These are the [latest results from run 2023-08-18T06:53:01.114817](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__xglm-1.7B/blob/main/results_2023-08-18T06%3A53%3A01.114817.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25240001646517257,
"acc_stderr": 0.03129950810334678,
"acc_norm": 0.2546195139198785,
"acc_norm_stderr": 0.03131375977392343,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487298,
"mc2": 0.37206341516963043,
"mc2_stderr": 0.014131533976938119
},
"harness|arc:challenge|25": {
"acc": 0.22098976109215018,
"acc_stderr": 0.012124929206818258,
"acc_norm": 0.25853242320819114,
"acc_norm_stderr": 0.012794553754288675
},
"harness|hellaswag|10": {
"acc": 0.3633738299143597,
"acc_stderr": 0.004799882248494812,
"acc_norm": 0.4567815176259709,
"acc_norm_stderr": 0.00497110626504656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173042,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173042
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.0281854413012341,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.0281854413012341
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.02066059748502693,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.02066059748502693
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715473,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715473
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780306,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23766816143497757,
"acc_stderr": 0.02856807946471425,
"acc_norm": 0.23766816143497757,
"acc_norm_stderr": 0.02856807946471425
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.16793893129770993,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.16793893129770993,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2669220945083014,
"acc_stderr": 0.01581845089477755,
"acc_norm": 0.2669220945083014,
"acc_norm_stderr": 0.01581845089477755
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912248,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912248
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32098765432098764,
"acc_stderr": 0.02597656601086273,
"acc_norm": 0.32098765432098764,
"acc_norm_stderr": 0.02597656601086273
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590634,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590634
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.03023375855159646,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.03023375855159646
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528027,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528027
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.02653704531214531,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.02653704531214531
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487298,
"mc2": 0.37206341516963043,
"mc2_stderr": 0.014131533976938119
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics | 2023-09-17T20:39:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Harshvir/Llama-2-7B-physics
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Harshvir/Llama-2-7B-physics](https://huggingface.co/Harshvir/Llama-2-7B-physics)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T20:39:36.366627](https://huggingface.co/datasets/open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics/blob/main/results_2023-09-17T20-39-36.366627.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03680788590604027,\n\
\ \"em_stderr\": 0.0019282642409219751,\n \"f1\": 0.10780620805369148,\n\
\ \"f1_stderr\": 0.0024191974799882767,\n \"acc\": 0.39476463537886264,\n\
\ \"acc_stderr\": 0.009842042454929716\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03680788590604027,\n \"em_stderr\": 0.0019282642409219751,\n\
\ \"f1\": 0.10780620805369148,\n \"f1_stderr\": 0.0024191974799882767\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07050796057619409,\n \
\ \"acc_stderr\": 0.007051543813983609\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875824\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Harshvir/Llama-2-7B-physics
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|arc:challenge|25_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T20_39_36.366627
path:
- '**/details_harness|drop|3_2023-09-17T20-39-36.366627.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T20-39-36.366627.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T20_39_36.366627
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-39-36.366627.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T20-39-36.366627.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hellaswag|10_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T21:02:56.107134.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T21:02:56.107134.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T20_39_36.366627
path:
- '**/details_harness|winogrande|5_2023-09-17T20-39-36.366627.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T20-39-36.366627.parquet'
- config_name: results
data_files:
- split: 2023_08_17T21_02_56.107134
path:
- results_2023-08-17T21:02:56.107134.parquet
- split: 2023_09_17T20_39_36.366627
path:
- results_2023-09-17T20-39-36.366627.parquet
- split: latest
path:
- results_2023-09-17T20-39-36.366627.parquet
---
# Dataset Card for Evaluation run of Harshvir/Llama-2-7B-physics
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Harshvir/Llama-2-7B-physics
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Harshvir/Llama-2-7B-physics](https://huggingface.co/Harshvir/Llama-2-7B-physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics",
"harness_winogrande_5",
split="train")
```
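The aggregated metrics are exposed through the "results" configuration, and each individual run is also available under its timestamped split (the split names are listed in the configuration section above). A minimal sketch of both, assuming the same repository layout as the snippet above:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics"

# Aggregated metrics; the "latest" split always points at the most recent run.
results = load_dataset(REPO, "results", split="latest")

# A single task pinned to a specific run, using the timestamped split name
# taken from the configuration list of this card.
winogrande_run = load_dataset(
    REPO,
    "harness_winogrande_5",
    split="2023_09_17T20_39_36.366627",
)
```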
## Latest results
These are the [latest results from run 2023-09-17T20:39:36.366627](https://huggingface.co/datasets/open-llm-leaderboard/details_Harshvir__Llama-2-7B-physics/blob/main/results_2023-09-17T20-39-36.366627.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03680788590604027,
"em_stderr": 0.0019282642409219751,
"f1": 0.10780620805369148,
"f1_stderr": 0.0024191974799882767,
"acc": 0.39476463537886264,
"acc_stderr": 0.009842042454929716
},
"harness|drop|3": {
"em": 0.03680788590604027,
"em_stderr": 0.0019282642409219751,
"f1": 0.10780620805369148,
"f1_stderr": 0.0024191974799882767
},
"harness|gsm8k|5": {
"acc": 0.07050796057619409,
"acc_stderr": 0.007051543813983609
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875824
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v3 | 2023-08-27T12:41:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheTravellingEngineer/llama2-7b-chat-hf-v3](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:37:31.585910](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v3/blob/main/results_2023-08-17T18%3A37%3A31.585910.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4560481833979,\n\
\ \"acc_stderr\": 0.03526137828710552,\n \"acc_norm\": 0.46001212718662227,\n\
\ \"acc_norm_stderr\": 0.035249446836368305,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.38312967353704325,\n\
\ \"mc2_stderr\": 0.01380374942810861\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47525597269624575,\n \"acc_stderr\": 0.014593487694937738,\n\
\ \"acc_norm\": 0.5196245733788396,\n \"acc_norm_stderr\": 0.014600132075947087\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5774746066520613,\n\
\ \"acc_stderr\": 0.00492951701150822,\n \"acc_norm\": 0.7669786895040829,\n\
\ \"acc_norm_stderr\": 0.004218917037002668\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270658,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270658\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.45806451612903226,\n \"acc_stderr\": 0.028343787250540636,\n \"\
acc_norm\": 0.45806451612903226,\n \"acc_norm_stderr\": 0.028343787250540636\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6735751295336787,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.6735751295336787,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.02512465352588513,\n\
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.02512465352588513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6036697247706422,\n\
\ \"acc_stderr\": 0.020971469947900532,\n \"acc_norm\": 0.6036697247706422,\n\
\ \"acc_norm_stderr\": 0.020971469947900532\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.02876511171804696,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.02876511171804696\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5392156862745098,\n \"acc_stderr\": 0.03498501649369527,\n \"\
acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5991561181434599,\n \"acc_stderr\": 0.031900803894732356,\n \
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.031900803894732356\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.048979577377811674,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.048979577377811674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6206896551724138,\n\
\ \"acc_stderr\": 0.01735126811754445,\n \"acc_norm\": 0.6206896551724138,\n\
\ \"acc_norm_stderr\": 0.01735126811754445\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.028580341065138296,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.028580341065138296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.027794760105008746,\n\
\ \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.027794760105008746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320196,\n \
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320196\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3539765319426336,\n\
\ \"acc_stderr\": 0.012213504731731637,\n \"acc_norm\": 0.3539765319426336,\n\
\ \"acc_norm_stderr\": 0.012213504731731637\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428188,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428188\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.434640522875817,\n \"acc_stderr\": 0.020054269200726463,\n \
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.020054269200726463\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.031891418324213966,\n\
\ \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.031891418324213966\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.03445789964362749,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.03445789964362749\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488905,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488905\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.38312967353704325,\n\
\ \"mc2_stderr\": 0.01380374942810861\n }\n}\n```"
repo_url: https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:37:31.585910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:37:31.585910.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:37:31.585910.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:37:31.585910.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_37_31.585910
path:
- results_2023-08-17T18:37:31.585910.parquet
- split: latest
path:
- results_2023-08-17T18:37:31.585910.parquet
---
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-v3](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v3",
"harness_truthfulqa_mc_0",
split="train")
```
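Each per-task configuration also exposes a `latest` split, and the aggregated metrics live in the `results` configuration (see the YAML header of this card). A minimal sketch loading both, using the configuration names declared above:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v3"

# Per-sample details for one task (ARC-Challenge, 25-shot), pinned to the most recent run
arc_latest = load_dataset(REPO, "harness_arc_challenge_25", split="latest")

# Aggregated metrics for the run (the "results" configuration)
results_latest = load_dataset(REPO, "results", split="latest")

print(arc_latest)
print(results_latest)
```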
## Latest results
These are the [latest results from run 2023-08-17T18:37:31.585910](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v3/blob/main/results_2023-08-17T18%3A37%3A31.585910.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4560481833979,
"acc_stderr": 0.03526137828710552,
"acc_norm": 0.46001212718662227,
"acc_norm_stderr": 0.035249446836368305,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557989,
"mc2": 0.38312967353704325,
"mc2_stderr": 0.01380374942810861
},
"harness|arc:challenge|25": {
"acc": 0.47525597269624575,
"acc_stderr": 0.014593487694937738,
"acc_norm": 0.5196245733788396,
"acc_norm_stderr": 0.014600132075947087
},
"harness|hellaswag|10": {
"acc": 0.5774746066520613,
"acc_stderr": 0.00492951701150822,
"acc_norm": 0.7669786895040829,
"acc_norm_stderr": 0.004218917037002668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45806451612903226,
"acc_stderr": 0.028343787250540636,
"acc_norm": 0.45806451612903226,
"acc_norm_stderr": 0.028343787250540636
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6735751295336787,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.6735751295336787,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.02512465352588513,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.02512465352588513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6036697247706422,
"acc_stderr": 0.020971469947900532,
"acc_norm": 0.6036697247706422,
"acc_norm_stderr": 0.020971469947900532
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.02876511171804696,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.02876511171804696
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.031900803894732356,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.031900803894732356
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.048979577377811674,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.048979577377811674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.01735126811754445,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.01735126811754445
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.027794760105008746,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.027794760105008746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320196,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320196
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3539765319426336,
"acc_stderr": 0.012213504731731637,
"acc_norm": 0.3539765319426336,
"acc_norm_stderr": 0.012213504731731637
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428188,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428188
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.020054269200726463,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.020054269200726463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.03445789964362749,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.03445789964362749
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479637,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479637
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557989,
"mc2": 0.38312967353704325,
"mc2_stderr": 0.01380374942810861
}
}
```
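If you only need the aggregated numbers shown above, you can also fetch the raw results JSON that the link at the top of this section points to. A minimal sketch using `huggingface_hub` (the filename is taken from that link; the exact top-level layout of the file may wrap the metrics shown above, so inspect its keys first):
```python
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v3",
    filename="results_2023-08-17T18:37:31.585910.json",
    repo_type="dataset",
)
with open(path) as f:
    raw_results = json.load(f)

# The per-task metrics displayed above live somewhere inside this dict
print(list(raw_results.keys()))
```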
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3 | 2023-08-27T12:41:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-33b-gpt4-1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-33b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T17:42:39.017472](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3/blob/main/results_2023-08-18T17%3A42%3A39.017472.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5912008134990022,\n\
\ \"acc_stderr\": 0.03394360646904317,\n \"acc_norm\": 0.594673519163153,\n\
\ \"acc_norm_stderr\": 0.033921523174799094,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.01618574435514491,\n \"mc2\": 0.45333485575043747,\n\
\ \"mc2_stderr\": 0.01484625790003476\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.01418827771234981,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038085\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6648078072097192,\n\
\ \"acc_stderr\": 0.004710928569985761,\n \"acc_norm\": 0.8509261103365864,\n\
\ \"acc_norm_stderr\": 0.0035543339768972504\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621502,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621502\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.02487081525105709,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02487081525105709\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365904,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365904\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.02502861027671086,\n \
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.02502861027671086\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.03210479051015776,\n \
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.031911001928357954,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.031911001928357954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n\
\ \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n\
\ \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.02753007844711031,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.02753007844711031\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.026730620728004903,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.026730620728004903\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.012759117066518019,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.012759117066518019\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.01948802574552967,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.01948802574552967\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078684,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078684\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.01618574435514491,\n \"mc2\": 0.45333485575043747,\n\
\ \"mc2_stderr\": 0.01484625790003476\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|arc:challenge|25_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hellaswag|10_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:12:32.965020.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T17:42:39.017472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T17:42:39.017472.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:12:32.965020.parquet'
- split: 2023_08_18T17_42_39.017472
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T17:42:39.017472.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T17:42:39.017472.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_12_32.965020
path:
- results_2023-08-18T14:12:32.965020.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent run.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3",
"harness_truthfulqa_mc_0",
split="train")
```
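The same pattern applies to any of the per-task configurations listed in the YAML header above. The snippet below is a minimal sketch (the config and split names are copied verbatim from that configuration list; exact behaviour may depend on the `datasets` version in use):
```python
from datasets import load_dataset

# Load one MMLU sub-task at its most recent evaluation; "latest" points to the
# 2023-08-18T17:42:39.017472 run for this model.
management_latest = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3",
    "harness_hendrycksTest_management_5",
    split="latest",
)

# A specific run can also be requested through its timestamped split name.
management_first_run = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3",
    "harness_hendrycksTest_management_5",
    split="2023_08_18T14_12_32.965020",
)

print(management_latest)
print(management_first_run)
```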
## Latest results
These are the [latest results from run 2023-08-18T17:42:39.017472](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3/blob/main/results_2023-08-18T17%3A42%3A39.017472.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5912008134990022,
"acc_stderr": 0.03394360646904317,
"acc_norm": 0.594673519163153,
"acc_norm_stderr": 0.033921523174799094,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514491,
"mc2": 0.45333485575043747,
"mc2_stderr": 0.01484625790003476
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.01418827771234981,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038085
},
"harness|hellaswag|10": {
"acc": 0.6648078072097192,
"acc_stderr": 0.004710928569985761,
"acc_norm": 0.8509261103365864,
"acc_norm_stderr": 0.0035543339768972504
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02487081525105709,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02487081525105709
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365904,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365904
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.031911001928357954,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.031911001928357954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.02753007844711031,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.02753007844711031
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004903,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518019,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518019
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.01948802574552967,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.01948802574552967
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078684,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078684
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514491,
"mc2": 0.45333485575043747,
"mc2_stderr": 0.01484625790003476
}
}
```
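The aggregated numbers above can also be read directly from the results file linked at the start of this section. The following is a hedged sketch using `huggingface_hub`: the filename is taken from that link, but the exact top-level layout of the JSON file is an assumption, so the snippet falls back to the document root if there is no nested `"results"` key.
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.3",
    filename="results_2023-08-18T17:42:39.017472.json",
    repo_type="dataset",
)

with open(path) as f:
    raw = json.load(f)

# Assumption: some result files nest the metrics under a "results" key; handle both layouts.
results = raw.get("results", raw)

print(results["all"])                      # aggregated acc / acc_norm / mc1 / mc2
print(results["harness|truthfulqa:mc|0"])  # per-task entry, as shown above
```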
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-m2.0 | 2023-08-27T12:41:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-7b-gpt4-m2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-7b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-m2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-m2.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T12:14:57.901258](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-m2.0/blob/main/results_2023-08-18T12%3A14%3A57.901258.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4562591262750464,\n\
\ \"acc_stderr\": 0.03512875286400451,\n \"acc_norm\": 0.4597118458858273,\n\
\ \"acc_norm_stderr\": 0.0351166113473093,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4133702614368062,\n\
\ \"mc2_stderr\": 0.015511009966578495\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4906143344709898,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5051194539249146,\n \"acc_norm_stderr\": 0.01461062489030916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5794662417845051,\n\
\ \"acc_stderr\": 0.004926358564494565,\n \"acc_norm\": 0.76867157936666,\n\
\ \"acc_norm_stderr\": 0.004208200511232452\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.03681229633394319,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.03681229633394319\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362445,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362445\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484865,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484865\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5032258064516129,\n \"acc_stderr\": 0.028443414226438316,\n \"\
acc_norm\": 0.5032258064516129,\n \"acc_norm_stderr\": 0.028443414226438316\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.32019704433497537,\n \"acc_stderr\": 0.0328264938530415,\n \"\
acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.0328264938530415\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187896,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5353535353535354,\n \"acc_stderr\": 0.03553436368828063,\n \"\
acc_norm\": 0.5353535353535354,\n \"acc_norm_stderr\": 0.03553436368828063\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4076923076923077,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.4076923076923077,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6055045871559633,\n\
\ \"acc_stderr\": 0.02095464210858747,\n \"acc_norm\": 0.6055045871559633,\n\
\ \"acc_norm_stderr\": 0.02095464210858747\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353603,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353603\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6413502109704642,\n \"acc_stderr\": 0.031219569445301836,\n \
\ \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.031219569445301836\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n\
\ \"acc_stderr\": 0.031075028526507748,\n \"acc_norm\": 0.6581196581196581,\n\
\ \"acc_norm_stderr\": 0.031075028526507748\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6181353767560664,\n\
\ \"acc_stderr\": 0.017373732736677586,\n \"acc_norm\": 0.6181353767560664,\n\
\ \"acc_norm_stderr\": 0.017373732736677586\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761987,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761987\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5176848874598071,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.5176848874598071,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02782074420373286,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02782074420373286\n },\n\
\ \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35815602836879434,\n\
\ \"acc_stderr\": 0.028602085862759415,\n \"acc_norm\": 0.35815602836879434,\n\
\ \"acc_norm_stderr\": 0.028602085862759415\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.3409387222946545,\n \"acc_stderr\": 0.01210681720306721,\n\
\ \"acc_norm\": 0.3409387222946545,\n \"acc_norm_stderr\": 0.01210681720306721\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\":\
\ 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\"\
: {\n \"acc\": 0.4542483660130719,\n \"acc_stderr\": 0.020142974553795198,\n\
\ \"acc_norm\": 0.4542483660130719,\n \"acc_norm_stderr\": 0.020142974553795198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5870646766169154,\n\
\ \"acc_stderr\": 0.03481520803367348,\n \"acc_norm\": 0.5870646766169154,\n\
\ \"acc_norm_stderr\": 0.03481520803367348\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.4133702614368062,\n\
\ \"mc2_stderr\": 0.015511009966578495\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-m2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|arc:challenge|25_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hellaswag|10_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:14:57.901258.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:14:57.901258.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T12:14:57.901258.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T12:14:57.901258.parquet'
- config_name: results
data_files:
- split: 2023_08_18T12_14_57.901258
path:
- results_2023-08-18T12:14:57.901258.parquet
- split: latest
path:
- results_2023-08-18T12:14:57.901258.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-m2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-7b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-m2.0",
"harness_truthfulqa_mc_0",
split="train")
```
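As noted above, each configuration also exposes a `latest` split, and the aggregated metrics live in the `results` configuration. A minimal sketch of loading both, using one of the subtask configurations listed in this card (the variable names are illustrative):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-m2.0"

# Per-example details for one MMLU subtask, pinned to the most recent run
details = load_dataset(repo, "harness_hendrycksTest_abstract_algebra_5", split="latest")

# Aggregated metrics for the whole run
results = load_dataset(repo, "results", split="latest")
```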
## Latest results
These are the [latest results from run 2023-08-18T12:14:57.901258](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-m2.0/blob/main/results_2023-08-18T12%3A14%3A57.901258.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4562591262750464,
"acc_stderr": 0.03512875286400451,
"acc_norm": 0.4597118458858273,
"acc_norm_stderr": 0.0351166113473093,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4133702614368062,
"mc2_stderr": 0.015511009966578495
},
"harness|arc:challenge|25": {
"acc": 0.4906143344709898,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5051194539249146,
"acc_norm_stderr": 0.01461062489030916
},
"harness|hellaswag|10": {
"acc": 0.5794662417845051,
"acc_stderr": 0.004926358564494565,
"acc_norm": 0.76867157936666,
"acc_norm_stderr": 0.004208200511232452
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.03681229633394319,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.03681229633394319
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.036186648199362445,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.036186648199362445
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484865,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5032258064516129,
"acc_stderr": 0.028443414226438316,
"acc_norm": 0.5032258064516129,
"acc_norm_stderr": 0.028443414226438316
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.0328264938530415,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.0328264938530415
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5353535353535354,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.5353535353535354,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4076923076923077,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.4076923076923077,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6055045871559633,
"acc_stderr": 0.02095464210858747,
"acc_norm": 0.6055045871559633,
"acc_norm_stderr": 0.02095464210858747
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353603,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353603
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6413502109704642,
"acc_stderr": 0.031219569445301836,
"acc_norm": 0.6413502109704642,
"acc_norm_stderr": 0.031219569445301836
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.031075028526507748,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.031075028526507748
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6181353767560664,
"acc_stderr": 0.017373732736677586,
"acc_norm": 0.6181353767560664,
"acc_norm_stderr": 0.017373732736677586
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761987,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761987
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5176848874598071,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.5176848874598071,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5,
"acc_stderr": 0.02782074420373286,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02782074420373286
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759415,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759415
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3409387222946545,
"acc_stderr": 0.01210681720306721,
"acc_norm": 0.3409387222946545,
"acc_norm_stderr": 0.01210681720306721
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4542483660130719,
"acc_stderr": 0.020142974553795198,
"acc_norm": 0.4542483660130719,
"acc_norm_stderr": 0.020142974553795198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5870646766169154,
"acc_stderr": 0.03481520803367348,
"acc_norm": 0.5870646766169154,
"acc_norm_stderr": 0.03481520803367348
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.4133702614368062,
"mc2_stderr": 0.015511009966578495
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-m2.0 | 2023-08-27T12:41:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-gpt4-m2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-13b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-m2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-m2.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T08:32:42.679525](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-m2.0/blob/main/results_2023-08-18T08%3A32%3A42.679525.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5387637831724015,\n\
\ \"acc_stderr\": 0.03452359530334484,\n \"acc_norm\": 0.5428924052435001,\n\
\ \"acc_norm_stderr\": 0.03450434556391571,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.39699790117986267,\n\
\ \"mc2_stderr\": 0.015535957579740245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5469283276450512,\n \"acc_stderr\": 0.014546892052005628,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449698\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6118303126867158,\n\
\ \"acc_stderr\": 0.004863375698153858,\n \"acc_norm\": 0.8101971718781119,\n\
\ \"acc_norm_stderr\": 0.003913435835391443\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.603225806451613,\n\
\ \"acc_stderr\": 0.027831231605767948,\n \"acc_norm\": 0.603225806451613,\n\
\ \"acc_norm_stderr\": 0.027831231605767948\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244442,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244442\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n\
\ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514567,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514567\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7174311926605504,\n\
\ \"acc_stderr\": 0.019304243497707152,\n \"acc_norm\": 0.7174311926605504,\n\
\ \"acc_norm_stderr\": 0.019304243497707152\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n\
\ \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395592,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929187,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929187\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.037149084099355745,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.037149084099355745\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483713,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483713\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n\
\ \"acc_stderr\": 0.01622501794477098,\n \"acc_norm\": 0.7100893997445722,\n\
\ \"acc_norm_stderr\": 0.01622501794477098\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.0158394004062125,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.0158394004062125\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.02720111766692565,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.02720111766692565\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994099,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994099\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39308996088657105,\n\
\ \"acc_stderr\": 0.01247489961387396,\n \"acc_norm\": 0.39308996088657105,\n\
\ \"acc_norm_stderr\": 0.01247489961387396\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181357,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181357\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894843,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894843\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.39699790117986267,\n\
\ \"mc2_stderr\": 0.015535957579740245\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-m2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|arc:challenge|25_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hellaswag|10_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:32:42.679525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T08:32:42.679525.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T08:32:42.679525.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T08:32:42.679525.parquet'
- config_name: results
data_files:
- split: 2023_08_18T08_32_42.679525
path:
- results_2023-08-18T08:32:42.679525.parquet
- split: latest
path:
- results_2023-08-18T08:32:42.679525.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-m2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-m2.0",
"harness_truthfulqa_mc_0",
split="train")
```
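Beyond the single call above, here is a minimal sketch (assuming a recent `datasets` release, which exposes `get_dataset_config_names`) for browsing the other per-task configurations and pinning either the timestamped split or the `latest` split:
```python
from datasets import load_dataset, get_dataset_config_names

repo = "open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-m2.0"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs))

# Per-example details for a single task; "latest" always points at the most
# recent run, while the timestamped split (e.g. "2023_08_18T08_32_42.679525")
# pins that specific run.
details = load_dataset(repo, "harness_hendrycksTest_world_religions_5", split="latest")
print(details[0])
```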
## Latest results
These are the [latest results from run 2023-08-18T08:32:42.679525](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-m2.0/blob/main/results_2023-08-18T08%3A32%3A42.679525.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5387637831724015,
"acc_stderr": 0.03452359530334484,
"acc_norm": 0.5428924052435001,
"acc_norm_stderr": 0.03450434556391571,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522512,
"mc2": 0.39699790117986267,
"mc2_stderr": 0.015535957579740245
},
"harness|arc:challenge|25": {
"acc": 0.5469283276450512,
"acc_stderr": 0.014546892052005628,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449698
},
"harness|hellaswag|10": {
"acc": 0.6118303126867158,
"acc_stderr": 0.004863375698153858,
"acc_norm": 0.8101971718781119,
"acc_norm_stderr": 0.003913435835391443
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.603225806451613,
"acc_stderr": 0.027831231605767948,
"acc_norm": 0.603225806451613,
"acc_norm_stderr": 0.027831231605767948
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.025329663163489943,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.025329663163489943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514567,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514567
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7174311926605504,
"acc_stderr": 0.019304243497707152,
"acc_norm": 0.7174311926605504,
"acc_norm_stderr": 0.019304243497707152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929187,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929187
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.037149084099355745,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.037149084099355745
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483713,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483713
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.01622501794477098,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.01622501794477098
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931505,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931505
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.0158394004062125,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.0158394004062125
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630998,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.02720111766692565,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.02720111766692565
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994099,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994099
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39308996088657105,
"acc_stderr": 0.01247489961387396,
"acc_norm": 0.39308996088657105,
"acc_norm_stderr": 0.01247489961387396
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181357,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181357
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.031001209039894843,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.031001209039894843
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522512,
"mc2": 0.39699790117986267,
"mc2_stderr": 0.015535957579740245
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-2.0 | 2023-08-27T12:41:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-65b-gpt4-2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-65b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-65b-gpt4-2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-2.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:40:33.016233](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-2.0/blob/main/results_2023-08-17T18%3A40%3A33.016233.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6327167597355648,\n\
\ \"acc_stderr\": 0.03300659636044851,\n \"acc_norm\": 0.6363584035715474,\n\
\ \"acc_norm_stderr\": 0.03298042481924313,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.491072375114453,\n\
\ \"mc2_stderr\": 0.015091020435469083\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.014077223108470137,\n\
\ \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176536\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6841266679944235,\n\
\ \"acc_stderr\": 0.004639126951051431,\n \"acc_norm\": 0.8665604461262697,\n\
\ \"acc_norm_stderr\": 0.0033935420742276404\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.02402225613030823,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.02402225613030823\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.034767257476490364,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.034767257476490364\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.02777253333421896,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.02777253333421896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335065,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342863,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342863\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.01672268452620015,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.01672268452620015\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.0225355263526927,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.0225355263526927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001512,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001512\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053738,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053738\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\
\ \"acc_stderr\": 0.012753716929101003,\n \"acc_norm\": 0.4745762711864407,\n\
\ \"acc_norm_stderr\": 0.012753716929101003\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378461,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378461\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.491072375114453,\n\
\ \"mc2_stderr\": 0.015091020435469083\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-65b-gpt4-2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:40:33.016233.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:40:33.016233.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:40:33.016233.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:40:33.016233.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_40_33.016233
path:
- results_2023-08-17T18:40:33.016233.parquet
- split: latest
path:
- results_2023-08-17T18:40:33.016233.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-65b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-2.0",
"harness_truthfulqa_mc_0",
split="train")
```
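Beyond the snippet above, here is a minimal illustrative sketch (not part of the automatically generated card): any configuration declared in the YAML metadata can be loaded the same way, and the "latest" split always points to the most recent run. For example:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-2.0"

# Per-task details, e.g. the 5-shot world_religions MMLU sub-task from the latest run.
world_religions = load_dataset(REPO, "harness_hendrycksTest_world_religions_5", split="latest")

# Aggregated metrics for the run (the "results" configuration); the exact column
# layout is not reproduced in this card, so inspect the first row to see what is stored.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```
The per-task configurations hold the underlying prediction details, while "results" holds the aggregated metrics reported below.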
## Latest results
These are the [latest results from run 2023-08-17T18:40:33.016233](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-2.0/blob/main/results_2023-08-17T18%3A40%3A33.016233.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6327167597355648,
"acc_stderr": 0.03300659636044851,
"acc_norm": 0.6363584035715474,
"acc_norm_stderr": 0.03298042481924313,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.491072375114453,
"mc2_stderr": 0.015091020435469083
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.014077223108470137,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176536
},
"harness|hellaswag|10": {
"acc": 0.6841266679944235,
"acc_stderr": 0.004639126951051431,
"acc_norm": 0.8665604461262697,
"acc_norm_stderr": 0.0033935420742276404
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.02402225613030823,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.02402225613030823
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.034767257476490364,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.034767257476490364
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.02777253333421896,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.02777253333421896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342863,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342863
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.01672268452620015,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.01672268452620015
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.0225355263526927,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.0225355263526927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001512,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001512
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053738,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053738
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101003,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101003
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378461,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378461
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.491072375114453,
"mc2_stderr": 0.015091020435469083
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dell-research-harvard/AmericanStoriesTraining | 2023-09-12T17:48:00.000Z | [
"license:apache-2.0",
"region:us"
] | dell-research-harvard | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0 | 2023-08-27T12:41:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-gpt4-2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-13b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T16:46:20.305842](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0/blob/main/results_2023-08-17T16%3A46%3A20.305842.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5486029946283609,\n\
\ \"acc_stderr\": 0.034448042896790196,\n \"acc_norm\": 0.5526065245703492,\n\
\ \"acc_norm_stderr\": 0.034427462914517405,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.3646576349277634,\n\
\ \"mc2_stderr\": 0.014953020453041688\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526847,\n\
\ \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.014370358632472437\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n\
\ \"acc_stderr\": 0.004819367172685959,\n \"acc_norm\": 0.8282214698267277,\n\
\ \"acc_norm_stderr\": 0.003764169746646175\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848879,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848879\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302844,\n \"\
acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302844\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"\
acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615486,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615486\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371218,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371218\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.01922746887646351,\n \"\
acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.01922746887646351\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598028,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598028\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404032,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404032\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7369093231162197,\n\
\ \"acc_stderr\": 0.01574549716904905,\n \"acc_norm\": 0.7369093231162197,\n\
\ \"acc_norm_stderr\": 0.01574549716904905\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124658,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124658\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n\
\ \"acc_stderr\": 0.015183844307206151,\n \"acc_norm\": 0.2905027932960894,\n\
\ \"acc_norm_stderr\": 0.015183844307206151\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063146,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347813,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925654,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994099,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994099\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.012643004623790208,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.012643004623790208\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.0302906191804857,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.0302906191804857\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.3646576349277634,\n\
\ \"mc2_stderr\": 0.014953020453041688\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|arc:challenge|25_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hellaswag|10_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:46:20.305842.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T16:46:20.305842.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T16:46:20.305842.parquet'
- config_name: results
data_files:
- split: 2023_08_17T16_46_20.305842
path:
- results_2023-08-17T16:46:20.305842.parquet
- split: latest
path:
- results_2023-08-17T16:46:20.305842.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0",
"harness_truthfulqa_mc_0",
split="train")
```
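The aggregated metrics shown in the next section live in the separate "results" configuration; as a minimal sketch (assuming the same `datasets` API and the "latest" split alias listed in the configs above), they can be loaded with:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run, via the "results" configuration
# and the "latest" split alias described in this card.
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0",
    "results",
    split="latest",
)
```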
## Latest results
These are the [latest results from run 2023-08-17T16:46:20.305842](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-2.0/blob/main/results_2023-08-17T16%3A46%3A20.305842.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5486029946283609,
"acc_stderr": 0.034448042896790196,
"acc_norm": 0.5526065245703492,
"acc_norm_stderr": 0.034427462914517405,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.3646576349277634,
"mc2_stderr": 0.014953020453041688
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526847,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.014370358632472437
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685959,
"acc_norm": 0.8282214698267277,
"acc_norm_stderr": 0.003764169746646175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848879,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848879
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036810508691615486,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036810508691615486
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371218,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371218
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.01922746887646351,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.01922746887646351
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404032,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404032
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7369093231162197,
"acc_stderr": 0.01574549716904905,
"acc_norm": 0.7369093231162197,
"acc_norm_stderr": 0.01574549716904905
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124658,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124658
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.015183844307206151,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.015183844307206151
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063146,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347813,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925654,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994099,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994099
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790208,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790208
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.0302906191804857,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.0302906191804857
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.3646576349277634,
"mc2_stderr": 0.014953020453041688
}
}
```
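If you want these aggregated numbers as a table rather than raw JSON, a minimal sketch is shown below. It assumes this dataset exposes the same `results` configuration with a `latest` split that the other Open LLM Leaderboard detail datasets use; the column layout of the resulting DataFrame is not guaranteed.
```python
from datasets import load_dataset

# Sketch (assumption): load the aggregated "results" configuration for this model,
# using the "latest" split convention shared by the leaderboard detail datasets.
results = load_dataset(
    "open-llm-leaderboard/details_migtissera__Synthia-7B",
    "results",
    split="latest",
)

# Each record carries the aggregated metrics shown above; inspect it as a DataFrame.
df = results.to_pandas()
print(df.iloc[0])
```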
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4 | 2023-08-27T12:41:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-13b-gpt4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-13b-gpt4](https://huggingface.co/jondurbin/airoboros-13b-gpt4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T14:07:58.585031](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4/blob/main/results_2023-08-18T14%3A07%3A58.585031.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4830897578017286,\n\
\ \"acc_stderr\": 0.03489926540538226,\n \"acc_norm\": 0.48680328691224617,\n\
\ \"acc_norm_stderr\": 0.03487903384715786,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.47650415239645905,\n\
\ \"mc2_stderr\": 0.015250263669871788\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.0144813762245589,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6411073491336388,\n\
\ \"acc_stderr\": 0.004786953146657062,\n \"acc_norm\": 0.8329018123879706,\n\
\ \"acc_norm_stderr\": 0.003723010745878389\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.03077265364207567,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.03077265364207567\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707546,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707546\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5451612903225806,\n \"acc_stderr\": 0.02832774309156106,\n \"\
acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.02832774309156106\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n \"\
acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5353535353535354,\n \"acc_stderr\": 0.035534363688280626,\n \"\
acc_norm\": 0.5353535353535354,\n \"acc_norm_stderr\": 0.035534363688280626\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.034356961683613546,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.034356961683613546\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46923076923076923,\n \"acc_stderr\": 0.02530295889085015,\n\
\ \"acc_norm\": 0.46923076923076923,\n \"acc_norm_stderr\": 0.02530295889085015\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766114,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829125,\n \"\
acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829125\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.02904133351059804,\n\
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334383,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334383\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833587,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\
\ \"acc_stderr\": 0.02876034895652341,\n \"acc_norm\": 0.7393162393162394,\n\
\ \"acc_norm_stderr\": 0.02876034895652341\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6794380587484036,\n\
\ \"acc_stderr\": 0.01668889331080376,\n \"acc_norm\": 0.6794380587484036,\n\
\ \"acc_norm_stderr\": 0.01668889331080376\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282532,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282532\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761985,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761985\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.028555827516528784,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.028555827516528784\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n\
\ \"acc_stderr\": 0.028290869054197608,\n \"acc_norm\": 0.5434083601286174,\n\
\ \"acc_norm_stderr\": 0.028290869054197608\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3963494132985658,\n\
\ \"acc_stderr\": 0.012492830452095222,\n \"acc_norm\": 0.3963494132985658,\n\
\ \"acc_norm_stderr\": 0.012492830452095222\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5228758169934641,\n \"acc_stderr\": 0.020206653187884786,\n \
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.020206653187884786\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.03199615232806286,\n\
\ \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.03199615232806286\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\
\ \"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.6368159203980099,\n\
\ \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708312,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708312\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.47650415239645905,\n\
\ \"mc2_stderr\": 0.015250263669871788\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-13b-gpt4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:07:58.585031.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- results_2023-08-18T14:07:58.585031.parquet
- split: latest
path:
- results_2023-08-18T14:07:58.585031.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b-gpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4](https://huggingface.co/jondurbin/airoboros-13b-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4",
"harness_truthfulqa_mc_0",
split="train")
```
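A related sketch, not part of the original card: the `configs` section above also lists a `results` configuration whose `latest` split points at the aggregated metrics reproduced in the next section. Assuming that configuration loads like any other, you could read it as follows.
```python
from datasets import load_dataset

# Sketch (assumption): load the aggregated results for the latest run of this model.
# "results" and "latest" come from the configs section of this card; the exact
# schema of the underlying parquet file is not guaranteed.
agg = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4",
    "results",
    split="latest",
)
print(agg[0])  # one record containing the aggregated metrics
```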
## Latest results
These are the [latest results from run 2023-08-18T14:07:58.585031](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4/blob/main/results_2023-08-18T14%3A07%3A58.585031.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4830897578017286,
"acc_stderr": 0.03489926540538226,
"acc_norm": 0.48680328691224617,
"acc_norm_stderr": 0.03487903384715786,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.47650415239645905,
"mc2_stderr": 0.015250263669871788
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.0144813762245589,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.6411073491336388,
"acc_stderr": 0.004786953146657062,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.003723010745878389
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.03077265364207567,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.03077265364207567
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707546,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707546
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.02832774309156106,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.02832774309156106
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5353535353535354,
"acc_stderr": 0.035534363688280626,
"acc_norm": 0.5353535353535354,
"acc_norm_stderr": 0.035534363688280626
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46923076923076923,
"acc_stderr": 0.02530295889085015,
"acc_norm": 0.46923076923076923,
"acc_norm_stderr": 0.02530295889085015
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766114,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.020504729013829125,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.020504729013829125
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334383,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334383
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833587,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833587
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7393162393162394,
"acc_stderr": 0.02876034895652341,
"acc_norm": 0.7393162393162394,
"acc_norm_stderr": 0.02876034895652341
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6794380587484036,
"acc_stderr": 0.01668889331080376,
"acc_norm": 0.6794380587484036,
"acc_norm_stderr": 0.01668889331080376
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.026907849856282532,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.026907849856282532
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761985,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761985
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.028555827516528784,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.028555827516528784
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.028290869054197608,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.028290869054197608
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3963494132985658,
"acc_stderr": 0.012492830452095222,
"acc_norm": 0.3963494132985658,
"acc_norm_stderr": 0.012492830452095222
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.020206653187884786,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.020206653187884786
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5142857142857142,
"acc_stderr": 0.03199615232806286,
"acc_norm": 0.5142857142857142,
"acc_norm_stderr": 0.03199615232806286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.03400598505599014,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.03400598505599014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.47650415239645905,
"mc2_stderr": 0.015250263669871788
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16 | 2023-08-27T12:41:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/robin-65b-v2-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/robin-65b-v2-fp16](https://huggingface.co/TheBloke/robin-65b-v2-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T22:09:59.169977](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16/blob/main/results_2023-08-17T22%3A09%3A59.169977.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each of them in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6247081012105746,\n\
\ \"acc_stderr\": 0.03306437561725338,\n \"acc_norm\": 0.6287581990313466,\n\
\ \"acc_norm_stderr\": 0.033040816221322156,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5230660045885717,\n\
\ \"mc2_stderr\": 0.014819358026329301\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349808\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n\
\ \"acc_stderr\": 0.004791601975612765,\n \"acc_norm\": 0.8460466042620992,\n\
\ \"acc_norm_stderr\": 0.0036016648387189004\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798325,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798325\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.031907012423268113,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.031907012423268113\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727062,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727062\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"\
acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135374,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135374\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990945,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.01370264371536898,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.01370264371536898\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532337,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.027184498909941613,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.027184498909941613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5230660045885717,\n\
\ \"mc2_stderr\": 0.014819358026329301\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/robin-65b-v2-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:09:59.169977.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:09:59.169977.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:09:59.169977.parquet'
- config_name: results
data_files:
- split: 2023_08_17T22_09_59.169977
path:
- results_2023-08-17T22:09:59.169977.parquet
- split: latest
path:
- results_2023-08-17T22:09:59.169977.parquet
---
# Dataset Card for Evaluation run of TheBloke/robin-65b-v2-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/robin-65b-v2-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/robin-65b-v2-fp16](https://huggingface.co/TheBloke/robin-65b-v2-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
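Each configuration listed in the YAML header above also exposes a "latest" split (pointing to the most recent run) alongside the timestamped split. As a minimal sketch, assuming the config names shown in this card ("results" and the per-task "harness_*" configs), the aggregated metrics or a single task's details can be loaded like this:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run: the "results" config with the
# "latest" split (this card currently has a single run from 2023-08-17).
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16",
    "results",
    split="latest",
)

# A single per-task configuration, e.g. the 5-shot world_religions subset:
world_religions = load_dataset(
    "open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```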
## Latest results
These are the [latest results from run 2023-08-17T22:09:59.169977](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__robin-65b-v2-fp16/blob/main/results_2023-08-17T22%3A09%3A59.169977.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6247081012105746,
"acc_stderr": 0.03306437561725338,
"acc_norm": 0.6287581990313466,
"acc_norm_stderr": 0.033040816221322156,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5230660045885717,
"mc2_stderr": 0.014819358026329301
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349808
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.004791601975612765,
"acc_norm": 0.8460466042620992,
"acc_norm_stderr": 0.0036016648387189004
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798325,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798325
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727062,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727062
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135374,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135374
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266857,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990945,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990945
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.01370264371536898,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.01370264371536898
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5230660045885717,
"mc2_stderr": 0.014819358026329301
}
}
```
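Each `harness|hendrycksTest-*|5` entry above is one MMLU subtask. As a rough illustration (not part of the card), a results dictionary shaped like the JSON above can be reduced to a single MMLU score by averaging `acc_norm` over those subtasks:

```python
# Sketch: average acc_norm over the "hendrycksTest" (MMLU) subtasks of a
# results dict shaped like the JSON above.
def mmlu_average(results: dict) -> float:
    scores = [
        metrics["acc_norm"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)
```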
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_timdettmers__guanaco-65b-merged | 2023-08-27T12:41:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of timdettmers/guanaco-65b-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [timdettmers/guanaco-65b-merged](https://huggingface.co/timdettmers/guanaco-65b-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_timdettmers__guanaco-65b-merged\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T00:17:34.582006](https://huggingface.co/datasets/open-llm-leaderboard/details_timdettmers__guanaco-65b-merged/blob/main/results_2023-08-18T00%3A17%3A34.582006.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25104389504062485,\n\
\ \"acc_stderr\": 0.030647487837110618,\n \"acc_norm\": 0.2523346329049775,\n\
\ \"acc_norm_stderr\": 0.030669736900925226,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752346,\n \"mc2\": 0.4840947451540454,\n\
\ \"mc2_stderr\": 0.016324348732205056\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n\
\ \"acc_norm\": 0.27474402730375425,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2615016928898626,\n\
\ \"acc_stderr\": 0.004385544487143912,\n \"acc_norm\": 0.26598287193786097,\n\
\ \"acc_norm_stderr\": 0.004409521343140112\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n\
\ \"acc_stderr\": 0.011311347690633881,\n \"acc_norm\": 0.26792698826597133,\n\
\ \"acc_norm_stderr\": 0.011311347690633881\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752346,\n\
\ \"mc2\": 0.4840947451540454,\n \"mc2_stderr\": 0.016324348732205056\n\
\ }\n}\n```"
repo_url: https://huggingface.co/timdettmers/guanaco-65b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:17:34.582006.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:17:34.582006.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:17:34.582006.parquet'
- config_name: results
data_files:
- split: 2023_08_18T00_17_34.582006
path:
- results_2023-08-18T00:17:34.582006.parquet
- split: latest
path:
- results_2023-08-18T00:17:34.582006.parquet
---
# Dataset Card for Evaluation run of timdettmers/guanaco-65b-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/timdettmers/guanaco-65b-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [timdettmers/guanaco-65b-merged](https://huggingface.co/timdettmers/guanaco-65b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_timdettmers__guanaco-65b-merged",
"harness_truthfulqa_mc_0",
split="train")
```
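Individual runs can also be addressed directly by their timestamped split rather than through `latest`. A minimal sketch, using the split name listed in the configuration section of this card (with a single run, it resolves to the same data as `latest`):

```python
from datasets import load_dataset

# Assumption: split names follow the timestamped form listed in this card's configs.
run = load_dataset(
    "open-llm-leaderboard/details_timdettmers__guanaco-65b-merged",
    "harness_arc_challenge_25",
    split="2023_08_18T00_17_34.582006",
)
print(len(run))  # number of ARC-Challenge examples recorded for this run
```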
## Latest results
These are the [latest results from run 2023-08-18T00:17:34.582006](https://huggingface.co/datasets/open-llm-leaderboard/details_timdettmers__guanaco-65b-merged/blob/main/results_2023-08-18T00%3A17%3A34.582006.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25104389504062485,
"acc_stderr": 0.030647487837110618,
"acc_norm": 0.2523346329049775,
"acc_norm_stderr": 0.030669736900925226,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752346,
"mc2": 0.4840947451540454,
"mc2_stderr": 0.016324348732205056
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.27474402730375425,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.2615016928898626,
"acc_stderr": 0.004385544487143912,
"acc_norm": 0.26598287193786097,
"acc_norm_stderr": 0.004409521343140112
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891148,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891148
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26792698826597133,
"acc_stderr": 0.011311347690633881,
"acc_norm": 0.26792698826597133,
"acc_norm_stderr": 0.011311347690633881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752346,
"mc2": 0.4840947451540454,
"mc2_stderr": 0.016324348732205056
}
}
```
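The same aggregated numbers are also stored as a JSON file in the repository. A hedged sketch of fetching that file directly with `huggingface_hub`, using the filename linked above, is:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file linked in the "Latest results" section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_timdettmers__guanaco-65b-merged",
    filename="results_2023-08-18T00:17:34.582006.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The "all" entry holds the averages reported above.
print(results["all"])
```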
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf | 2023-09-22T23:15:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T23:15:18.463104](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf/blob/main/results_2023-09-22T23-15-18.463104.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.031774328859060404,\n\
\ \"em_stderr\": 0.0017962473521312393,\n \"f1\": 0.08420092281879202,\n\
\ \"f1_stderr\": 0.0021474530604162255,\n \"acc\": 0.3646366953032391,\n\
\ \"acc_stderr\": 0.00915095624646051\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.031774328859060404,\n \"em_stderr\": 0.0017962473521312393,\n\
\ \"f1\": 0.08420092281879202,\n \"f1_stderr\": 0.0021474530604162255\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03866565579984837,\n \
\ \"acc_stderr\": 0.005310583162098024\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6906077348066298,\n \"acc_stderr\": 0.012991329330822995\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|arc:challenge|25_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T23_15_18.463104
path:
- '**/details_harness|drop|3_2023-09-22T23-15-18.463104.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T23-15-18.463104.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T23_15_18.463104
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-15-18.463104.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-15-18.463104.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hellaswag|10_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T23_15_18.463104
path:
- '**/details_harness|winogrande|5_2023-09-22T23-15-18.463104.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T23-15-18.463104.parquet'
- config_name: results
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- results_2023-08-18T12:43:45.904593.parquet
- split: 2023_09_22T23_15_18.463104
path:
- results_2023-09-22T23-15-18.463104.parquet
- split: latest
path:
- results_2023-09-22T23-15-18.463104.parquet
---
# Dataset Card for Evaluation run of openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf",
"harness_winogrande_5",
split="train")
```
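Because this dataset was created from two runs, the "results" configuration also exposes one split per run timestamp in addition to "latest". The snippet below is a small sketch of loading a specific run, using the split names listed in the configuration section above.
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf"

# The "latest" split of the "results" configuration points to the most recent run.
latest_results = load_dataset(REPO, "results", split="latest")

# The earlier run remains available under its timestamped split name,
# as listed in the configuration above.
earlier_results = load_dataset(REPO, "results", split="2023_08_18T12_43_45.904593")
```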
## Latest results
These are the [latest results from run 2023-09-22T23:15:18.463104](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf/blob/main/results_2023-09-22T23-15-18.463104.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.031774328859060404,
"em_stderr": 0.0017962473521312393,
"f1": 0.08420092281879202,
"f1_stderr": 0.0021474530604162255,
"acc": 0.3646366953032391,
"acc_stderr": 0.00915095624646051
},
"harness|drop|3": {
"em": 0.031774328859060404,
"em_stderr": 0.0017962473521312393,
"f1": 0.08420092281879202,
"f1_stderr": 0.0021474530604162255
},
"harness|gsm8k|5": {
"acc": 0.03866565579984837,
"acc_stderr": 0.005310583162098024
},
"harness|winogrande|5": {
"acc": 0.6906077348066298,
"acc_stderr": 0.012991329330822995
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M | 2023-08-27T12:41:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of BreadAi/gpt-YA-1-1_70M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BreadAi/gpt-YA-1-1_70M](https://huggingface.co/BreadAi/gpt-YA-1-1_70M) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:44:57.081356](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M/blob/main/results_2023-08-17T18%3A44%3A57.081356.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25263513213174,\n\
\ \"acc_stderr\": 0.031384190641892996,\n \"acc_norm\": 0.25361158577854925,\n\
\ \"acc_norm_stderr\": 0.03140357808961037,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.47091024738309234,\n\
\ \"mc2_stderr\": 0.015567301766466582\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.17491467576791808,\n \"acc_stderr\": 0.011101562501828225,\n\
\ \"acc_norm\": 0.22525597269624573,\n \"acc_norm_stderr\": 0.012207839995407305\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26638119896434975,\n\
\ \"acc_stderr\": 0.004411624374176701,\n \"acc_norm\": 0.27365066719776937,\n\
\ \"acc_norm_stderr\": 0.004449206295922396\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.03861229196653697,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.03861229196653697\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421255,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421255\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514206,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514206\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102147,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102147\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198913,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198913\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.03308818594415751,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.03308818594415751\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30512820512820515,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.30512820512820515,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27339449541284405,\n \"acc_stderr\": 0.01910929984609828,\n \"\
acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.01910929984609828\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n\
\ \"acc_stderr\": 0.02977177522814563,\n \"acc_norm\": 0.23529411764705882,\n\
\ \"acc_norm_stderr\": 0.02977177522814563\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.0283046579430353,\n\
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615767,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615767\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18376068376068377,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.18376068376068377,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27586206896551724,\n\
\ \"acc_stderr\": 0.015982814774695625,\n \"acc_norm\": 0.27586206896551724,\n\
\ \"acc_norm_stderr\": 0.015982814774695625\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855716,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.02282731749105968,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.02282731749105968\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.02866199620233531,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.02866199620233531\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132227,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\
\ \"acc_stderr\": 0.036942843353377997,\n \"acc_norm\": 0.18181818181818182,\n\
\ \"acc_norm_stderr\": 0.036942843353377997\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2938775510204082,\n \"acc_stderr\": 0.02916273841024975,\n\
\ \"acc_norm\": 0.2938775510204082,\n \"acc_norm_stderr\": 0.02916273841024975\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1686746987951807,\n\
\ \"acc_stderr\": 0.029152009627856544,\n \"acc_norm\": 0.1686746987951807,\n\
\ \"acc_norm_stderr\": 0.029152009627856544\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.47091024738309234,\n\
\ \"mc2_stderr\": 0.015567301766466582\n }\n}\n```"
repo_url: https://huggingface.co/BreadAi/gpt-YA-1-1_70M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:44:57.081356.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:44:57.081356.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:44:57.081356.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_44_57.081356
path:
- results_2023-08-17T18:44:57.081356.parquet
- split: latest
path:
- results_2023-08-17T18:44:57.081356.parquet
---
# Dataset Card for Evaluation run of BreadAi/gpt-YA-1-1_70M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BreadAi/gpt-YA-1-1_70M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BreadAi/gpt-YA-1-1_70M](https://huggingface.co/BreadAi/gpt-YA-1-1_70M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M",
"harness_truthfulqa_mc_0",
split="train")
```
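The same pattern works for any of the per-task configurations defined in this dataset's metadata. As a minimal sketch (the configuration and split names below are taken from the `configs` section above, and `get_dataset_config_names` is a standard helper from the `datasets` library), you could also enumerate the available configurations and load the most recent run of a single MMLU sub-task via its "latest" split:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M"

# One configuration per evaluated task, plus the aggregated "results" configuration.
print(get_dataset_config_names(repo))

# The "latest" split of each configuration points to the parquet file of the most recent run.
details = load_dataset(repo, "harness_hendrycksTest_abstract_algebra_5", split="latest")
print(details[0])
```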
## Latest results
These are the [latest results from run 2023-08-17T18:44:57.081356](https://huggingface.co/datasets/open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M/blob/main/results_2023-08-17T18%3A44%3A57.081356.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25263513213174,
"acc_stderr": 0.031384190641892996,
"acc_norm": 0.25361158577854925,
"acc_norm_stderr": 0.03140357808961037,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.47091024738309234,
"mc2_stderr": 0.015567301766466582
},
"harness|arc:challenge|25": {
"acc": 0.17491467576791808,
"acc_stderr": 0.011101562501828225,
"acc_norm": 0.22525597269624573,
"acc_norm_stderr": 0.012207839995407305
},
"harness|hellaswag|10": {
"acc": 0.26638119896434975,
"acc_stderr": 0.004411624374176701,
"acc_norm": 0.27365066719776937,
"acc_norm_stderr": 0.004449206295922396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421255,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421255
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.028504856470514206,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.028504856470514206
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102147,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102147
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.3,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198913,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198913
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.03308818594415751,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.03308818594415751
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30512820512820515,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.30512820512820515,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371216,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.01910929984609828,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.01910929984609828
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615767,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615767
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18376068376068377,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.18376068376068377,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.015982814774695625,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.015982814774695625
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855716,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.02282731749105968,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.02282731749105968
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.02866199620233531,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.02866199620233531
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132227,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353377997,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353377997
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2938775510204082,
"acc_stderr": 0.02916273841024975,
"acc_norm": 0.2938775510204082,
"acc_norm_stderr": 0.02916273841024975
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1686746987951807,
"acc_stderr": 0.029152009627856544,
"acc_norm": 0.1686746987951807,
"acc_norm_stderr": 0.029152009627856544
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.47091024738309234,
"mc2_stderr": 0.015567301766466582
}
}
```
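To work with the aggregated numbers above programmatically instead of reading the JSON dump, one option (a minimal sketch, reusing the "results" configuration and "latest" split defined in this dataset's metadata) is to load the aggregated results directly:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics shown above;
# its "latest" split points at the parquet file of the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_BreadAi__gpt-YA-1-1_70M",
    "results",
    split="latest",
)
print(results[0])
```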
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3 | 2023-08-27T12:41:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2-13B-IA3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2-13B-IA3](https://huggingface.co/yeontaek/Platypus2-13B-IA3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T01:02:21.186475](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3/blob/main/results_2023-08-18T01%3A02%3A21.186475.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5642555800956389,\n\
\ \"acc_stderr\": 0.034228278563876655,\n \"acc_norm\": 0.568519190834164,\n\
\ \"acc_norm_stderr\": 0.03420609279919772,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476197,\n \"mc2\": 0.3834895821328264,\n\
\ \"mc2_stderr\": 0.014176650308316793\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804243,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045605\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6261700856403107,\n\
\ \"acc_stderr\": 0.004828305041904403,\n \"acc_norm\": 0.8265285799641505,\n\
\ \"acc_norm_stderr\": 0.003778804474605914\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.02418049716437691,\n \"acc_norm\"\
: 0.328042328042328,\n \"acc_norm_stderr\": 0.02418049716437691\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n\
\ \"acc_stderr\": 0.02582210611941591,\n \"acc_norm\": 0.7096774193548387,\n\
\ \"acc_norm_stderr\": 0.02582210611941591\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.02533466708095492,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.02533466708095492\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.01890416417151019,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.01890416417151019\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n\
\ \"acc_stderr\": 0.02933116229425173,\n \"acc_norm\": 0.7745098039215687,\n\
\ \"acc_norm_stderr\": 0.02933116229425173\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n\
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.0368035037128646,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.0368035037128646\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734195,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734195\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n\
\ \"acc_stderr\": 0.015801003729145897,\n \"acc_norm\": 0.33631284916201115,\n\
\ \"acc_norm_stderr\": 0.015801003729145897\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510467998,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510467998\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011628,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011628\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.020095083154577347,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.020095083154577347\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476197,\n \"mc2\": 0.3834895821328264,\n\
\ \"mc2_stderr\": 0.014176650308316793\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2-13B-IA3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:02:21.186475.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:02:21.186475.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:02:21.186475.parquet'
- config_name: results
data_files:
- split: 2023_08_18T01_02_21.186475
path:
- results_2023-08-18T01:02:21.186475.parquet
- split: latest
path:
- results_2023-08-18T01:02:21.186475.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-IA3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-IA3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-IA3](https://huggingface.co/yeontaek/Platypus2-13B-IA3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3",
"harness_truthfulqa_mc_0",
split="train")
```
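The aggregated "results" configuration described above can be loaded with the same call. This is a minimal sketch: the `results` config and the `latest` split are declared in this card's YAML header, but the column layout of the results file is not documented here, so the snippet only inspects what is returned.
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration.
# The "results" config and the "latest" split are declared in the YAML header
# above; the exact column names are not documented here, so we only inspect them.
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3",
    "results",
    split="latest",
)
print(results.column_names)  # list the available fields
print(results[0])            # peek at the aggregated metrics of the run
```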
## Latest results
These are the [latest results from run 2023-08-18T01:02:21.186475](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3/blob/main/results_2023-08-18T01%3A02%3A21.186475.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5642555800956389,
"acc_stderr": 0.034228278563876655,
"acc_norm": 0.568519190834164,
"acc_norm_stderr": 0.03420609279919772,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476197,
"mc2": 0.3834895821328264,
"mc2_stderr": 0.014176650308316793
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804243,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045605
},
"harness|hellaswag|10": {
"acc": 0.6261700856403107,
"acc_stderr": 0.004828305041904403,
"acc_norm": 0.8265285799641505,
"acc_norm_stderr": 0.003778804474605914
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437691,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437691
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.02582210611941591,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.02582210611941591
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.02533466708095492,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.02533466708095492
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.01890416417151019,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.01890416417151019
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425173,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425173
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.0368035037128646,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.0368035037128646
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734195,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734195
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33631284916201115,
"acc_stderr": 0.015801003729145897,
"acc_norm": 0.33631284916201115,
"acc_norm_stderr": 0.015801003729145897
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510467998,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510467998
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011628,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011628
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.020095083154577347,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.020095083154577347
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476197,
"mc2": 0.3834895821328264,
"mc2_stderr": 0.014176650308316793
}
}
```
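Each per-task configuration listed in the YAML header can be loaded in the same way, either via the `latest` split or via the timestamped split of a specific run. Below is a small sketch reusing the `harness_hendrycksTest_world_religions_5` configuration and the split names declared above; with a single run, both splits point at the same parquet file.
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_yeontaek__Platypus2-13B-IA3"
config = "harness_hendrycksTest_world_religions_5"

# "latest" always resolves to the most recent evaluation run for this task.
latest = load_dataset(repo, config, split="latest")

# The same details can also be addressed through the run's timestamped split,
# as declared in the data_files mapping above.
run = load_dataset(repo, config, split="2023_08_18T01_02_21.186475")

print(len(latest), len(run))  # expected to match, since only one run exists here
```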
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco | 2023-08-27T12:41:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70b-IA3-guanaco
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70b-IA3-guanaco](https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T03:44:14.521953](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco/blob/main/results_2023-08-18T03%3A44%3A14.521953.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6697579806120741,\n\
\ \"acc_stderr\": 0.031724667502401015,\n \"acc_norm\": 0.673711143625791,\n\
\ \"acc_norm_stderr\": 0.031697718177589644,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.43472046491596456,\n\
\ \"mc2_stderr\": 0.014163021655310653\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.013921008595179352,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6567416849233221,\n\
\ \"acc_stderr\": 0.004738264944737148,\n \"acc_norm\": 0.8567018522206732,\n\
\ \"acc_norm_stderr\": 0.003496605672960698\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.031489558297455304,\n\
\ \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.031489558297455304\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172534,\n\
\ \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172534\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645365,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645365\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372177,\n\
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8605504587155963,\n\
\ \"acc_stderr\": 0.014852421490033055,\n \"acc_norm\": 0.8605504587155963,\n\
\ \"acc_norm_stderr\": 0.014852421490033055\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8921568627450981,\n \"acc_stderr\": 0.021770522281368394,\n \"\
acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.021770522281368394\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n\
\ \"acc_stderr\": 0.030069584874494036,\n \"acc_norm\": 0.7219730941704036,\n\
\ \"acc_norm_stderr\": 0.030069584874494036\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.02826881219254063,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.02826881219254063\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8416347381864623,\n\
\ \"acc_stderr\": 0.013055346753516725,\n \"acc_norm\": 0.8416347381864623,\n\
\ \"acc_norm_stderr\": 0.013055346753516725\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.01624202883405362,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.01624202883405362\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982476,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982476\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\
\ \"acc_stderr\": 0.024406162094668886,\n \"acc_norm\": 0.7556270096463023,\n\
\ \"acc_norm_stderr\": 0.024406162094668886\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5176010430247718,\n\
\ \"acc_stderr\": 0.01276232129882365,\n \"acc_norm\": 0.5176010430247718,\n\
\ \"acc_norm_stderr\": 0.01276232129882365\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7205882352941176,\n \"acc_stderr\": 0.018152871051538812,\n \
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.018152871051538812\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900794,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900794\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.43472046491596456,\n\
\ \"mc2_stderr\": 0.014163021655310653\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|arc:challenge|25_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hellaswag|10_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:44:14.521953.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T03:44:14.521953.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T03:44:14.521953.parquet'
- config_name: results
data_files:
- split: 2023_08_18T03_44_14.521953
path:
- results_2023-08-18T03:44:14.521953.parquet
- split: latest
path:
- results_2023-08-18T03:44:14.521953.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70b-IA3-guanaco
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70b-IA3-guanaco](https://huggingface.co/yeontaek/llama-2-70b-IA3-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
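Since there are 61 per-task configurations, it can be convenient to enumerate them before picking one. The snippet below is a minimal sketch (assuming a recent version of the `datasets` library is installed); the example config names in the comment are taken from the metadata above:
```python
from datasets import get_dataset_config_names

# List every configuration stored in this repository, e.g.
# "harness_arc_challenge_25", "harness_hellaswag_10", ..., "results".
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco"
)
print(len(configs), configs[:5])
```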
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco",
"harness_truthfulqa_mc_0",
split="train")
```
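The same call works for any of the per-task configurations; each one exposes a timestamped split plus a `latest` split. As a minimal sketch, the aggregated `results` configuration (the one used to compute the leaderboard metrics) can be loaded and inspected like this:
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco",
    "results",
    split="latest",
)

# Print the dataset info and the first record to inspect the aggregated metrics.
print(results)
print(results[0])
```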
## Latest results
These are the [latest results from run 2023-08-18T03:44:14.521953](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70b-IA3-guanaco/blob/main/results_2023-08-18T03%3A44%3A14.521953.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6697579806120741,
"acc_stderr": 0.031724667502401015,
"acc_norm": 0.673711143625791,
"acc_norm_stderr": 0.031697718177589644,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.43472046491596456,
"mc2_stderr": 0.014163021655310653
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.013921008595179352,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6567416849233221,
"acc_stderr": 0.004738264944737148,
"acc_norm": 0.8567018522206732,
"acc_norm_stderr": 0.003496605672960698
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645365,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645365
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372177,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8605504587155963,
"acc_stderr": 0.014852421490033055,
"acc_norm": 0.8605504587155963,
"acc_norm_stderr": 0.014852421490033055
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.021770522281368394,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.021770522281368394
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494036,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494036
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.02826881219254063,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.02826881219254063
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8416347381864623,
"acc_stderr": 0.013055346753516725,
"acc_norm": 0.8416347381864623,
"acc_norm_stderr": 0.013055346753516725
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.01624202883405362,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.01624202883405362
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982476,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982476
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.024406162094668886,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.024406162094668886
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7716049382716049,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.7716049382716049,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5176010430247718,
"acc_stderr": 0.01276232129882365,
"acc_norm": 0.5176010430247718,
"acc_norm_stderr": 0.01276232129882365
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.018152871051538812,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.018152871051538812
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900794,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900794
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.43472046491596456,
"mc2_stderr": 0.014163021655310653
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3 | 2023-08-27T12:41:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2xOpenOrca-13B-IA3](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T07:56:15.654577](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3/blob/main/results_2023-08-18T07%3A56%3A15.654577.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5888553508763191,\n\
\ \"acc_stderr\": 0.03399387819981227,\n \"acc_norm\": 0.5929309965128265,\n\
\ \"acc_norm_stderr\": 0.03397280622807554,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.4787773325405089,\n\
\ \"mc2_stderr\": 0.015195311125902358\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000328\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.614618601872137,\n\
\ \"acc_stderr\": 0.004856906473719379,\n \"acc_norm\": 0.8209520015933081,\n\
\ \"acc_norm_stderr\": 0.0038260895866500527\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286634,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286634\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028417,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7064516129032258,\n \"acc_stderr\": 0.0259060870213193,\n \"acc_norm\"\
: 0.7064516129032258,\n \"acc_norm_stderr\": 0.0259060870213193\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n\
\ \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.4433497536945813,\n\
\ \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.03680918141673881,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.03680918141673881\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560417,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560417\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7573435504469987,\n\
\ \"acc_stderr\": 0.01532988894089986,\n \"acc_norm\": 0.7573435504469987,\n\
\ \"acc_norm_stderr\": 0.01532988894089986\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277902,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277902\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47039106145251397,\n\
\ \"acc_stderr\": 0.016693154927383574,\n \"acc_norm\": 0.47039106145251397,\n\
\ \"acc_norm_stderr\": 0.016693154927383574\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363947,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363947\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732852,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732852\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786696,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786696\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.4787773325405089,\n\
\ \"mc2_stderr\": 0.015195311125902358\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:56:15.654577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:56:15.654577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:56:15.654577.parquet'
- config_name: results
data_files:
- split: 2023_08_18T07_56_15.654577
path:
- results_2023-08-18T07:56:15.654577.parquet
- split: latest
path:
- results_2023-08-18T07:56:15.654577.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-IA3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-IA3](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-IA3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3",
"harness_truthfulqa_mc_0",
split="train")
```
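Each per-task configuration listed in the header can be loaded the same way. A minimal sketch, assuming the same `datasets` API as above (the config and split names below are taken from the configuration list in this card's header):
```python
from datasets import load_dataset

# Load the most recent run of a single MMLU subtask.
# Config names follow the "harness_hendrycksTest_<subject>_5" pattern listed above.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# A specific run can also be selected through its timestamped split name.
run_details = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3",
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_08_18T07_56_15.654577",
)
```
Since this dataset currently contains a single run, the "latest" split and the timestamped split point to the same parquet file.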
## Latest results
These are the [latest results from run 2023-08-18T07:56:15.654577](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3/blob/main/results_2023-08-18T07%3A56%3A15.654577.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5888553508763191,
"acc_stderr": 0.03399387819981227,
"acc_norm": 0.5929309965128265,
"acc_norm_stderr": 0.03397280622807554,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.4787773325405089,
"mc2_stderr": 0.015195311125902358
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000328
},
"harness|hellaswag|10": {
"acc": 0.614618601872137,
"acc_stderr": 0.004856906473719379,
"acc_norm": 0.8209520015933081,
"acc_norm_stderr": 0.0038260895866500527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286634,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286634
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028417,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.0259060870213193,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.0259060870213193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.03680918141673881,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.03680918141673881
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560417,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560417
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7573435504469987,
"acc_stderr": 0.01532988894089986,
"acc_norm": 0.7573435504469987,
"acc_norm_stderr": 0.01532988894089986
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277902,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277902
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47039106145251397,
"acc_stderr": 0.016693154927383574,
"acc_norm": 0.47039106145251397,
"acc_norm_stderr": 0.016693154927383574
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363947,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363947
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732852,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732852
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379778,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379778
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786696,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786696
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.4787773325405089,
"mc2_stderr": 0.015195311125902358
}
}
```
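The aggregated figures above are stored in the "results" configuration; a minimal sketch for loading them, assuming the same `datasets` API as in the example higher up:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; "latest" points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-IA3",
    "results",
    split="latest",
)

# Inspect the aggregated record (the exact field layout may differ slightly from the JSON shown above).
print(results[0])
```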
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa | 2023-09-18T02:18:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2-13B-LoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2-13B-LoRa](https://huggingface.co/yeontaek/Platypus2-13B-LoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T02:18:05.535474](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa/blob/main/results_2023-09-18T02-18-05.535474.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029991610738255032,\n\
\ \"em_stderr\": 0.0017467360834755531,\n \"f1\": 0.09242449664429517,\n\
\ \"f1_stderr\": 0.0021342871324921244,\n \"acc\": 0.417165368277252,\n\
\ \"acc_stderr\": 0.009636596178855414\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.029991610738255032,\n \"em_stderr\": 0.0017467360834755531,\n\
\ \"f1\": 0.09242449664429517,\n \"f1_stderr\": 0.0021342871324921244\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \
\ \"acc_stderr\": 0.007257633145486642\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224185\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2-13B-LoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T02_18_05.535474
path:
- '**/details_harness|drop|3_2023-09-18T02-18-05.535474.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T02-18-05.535474.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T02_18_05.535474
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-18-05.535474.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-18-05.535474.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:15:43.856388.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:15:43.856388.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T02_18_05.535474
path:
- '**/details_harness|winogrande|5_2023-09-18T02-18-05.535474.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T02-18-05.535474.parquet'
- config_name: results
data_files:
- split: 2023_08_18T01_15_43.856388
path:
- results_2023-08-18T01:15:43.856388.parquet
- split: 2023_09_18T02_18_05.535474
path:
- results_2023-09-18T02-18-05.535474.parquet
- split: latest
path:
- results_2023-09-18T02-18-05.535474.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-LoRa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-LoRa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-LoRa](https://huggingface.co/yeontaek/Platypus2-13B-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa",
"harness_winogrande_5",
split="train")
```
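You can also load the aggregated "results" configuration rather than a single task's details; a minimal sketch, assuming the `datasets` library is installed (the configuration and split names are those listed in this card):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the latest run
```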
## Latest results
These are the [latest results from run 2023-09-18T02:18:05.535474](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa/blob/main/results_2023-09-18T02-18-05.535474.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.029991610738255032,
"em_stderr": 0.0017467360834755531,
"f1": 0.09242449664429517,
"f1_stderr": 0.0021342871324921244,
"acc": 0.417165368277252,
"acc_stderr": 0.009636596178855414
},
"harness|drop|3": {
"em": 0.029991610738255032,
"em_stderr": 0.0017467360834755531,
"f1": 0.09242449664429517,
"f1_stderr": 0.0021342871324921244
},
"harness|gsm8k|5": {
"acc": 0.07505686125852919,
"acc_stderr": 0.007257633145486642
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224185
}
}
```
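To work with these numbers programmatically, or to iterate over every per-task configuration instead of copying names from this card, a small sketch (assuming a recent version of the `datasets` library; the repository, configuration, and split names are the ones documented above):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa"

# Enumerate all configurations of this details dataset
# (one per evaluated task, plus the aggregated "results" configuration).
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Load the latest Winogrande details, whose aggregated accuracy is reported above.
winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")
print(winogrande)
```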
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa | 2023-08-27T12:41:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2xOpenOrca-13B-LoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2xOpenOrca-13B-LoRa](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T14:49:25.189557](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa/blob/main/results_2023-08-18T14%3A49%3A25.189557.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5876448107959835,\n\
\ \"acc_stderr\": 0.034030456250524845,\n \"acc_norm\": 0.5919535320068932,\n\
\ \"acc_norm_stderr\": 0.03400897171284171,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.4515078409715004,\n\
\ \"mc2_stderr\": 0.015012527204943192\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804243,\n\
\ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670726\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6144194383588927,\n\
\ \"acc_stderr\": 0.004857374133246887,\n \"acc_norm\": 0.8208524198366859,\n\
\ \"acc_norm_stderr\": 0.0038269212990754017\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934266,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727062,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727062\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.0259060870213193,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.0259060870213193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964684,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964684\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059274,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059274\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.02441494730454367,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.02441494730454367\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.015133383278988829,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.015133383278988829\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765407,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765407\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186805,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186805\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.01274520462608314,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.01274520462608314\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.03010563657001663,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.03010563657001663\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6094771241830066,\n \"acc_stderr\": 0.019737008998094597,\n \
\ \"acc_norm\": 0.6094771241830066,\n \"acc_norm_stderr\": 0.019737008998094597\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.4515078409715004,\n\
\ \"mc2_stderr\": 0.015012527204943192\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:49:25.189557.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:49:25.189557.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:49:25.189557.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_49_25.189557
path:
- results_2023-08-18T14:49:25.189557.parquet
- split: latest
path:
- results_2023-08-18T14:49:25.189557.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2xOpenOrca-13B-LoRa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2xOpenOrca-13B-LoRa](https://huggingface.co/yeontaek/Platypus2xOpenOrca-13B-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa",
"harness_truthfulqa_mc_0",
split="train")
```
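Each per-task configuration listed in the YAML header above can be queried the same way. Below is a minimal sketch (assuming the `datasets` library is installed and the hub is reachable) that loads the `latest` split of the abstract-algebra MMLU details config and inspects it with pandas:
```python
from datasets import load_dataset

# Load the most recent run for a single MMLU sub-task;
# the config name and the "latest" split come from the YAML header above.
details = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Convert to a pandas DataFrame for quick inspection.
df = details.to_pandas()
print(df.columns)
print(len(df), "examples")
```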
## Latest results
These are the [latest results from run 2023-08-18T14:49:25.189557](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa/blob/main/results_2023-08-18T14%3A49%3A25.189557.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5876448107959835,
"acc_stderr": 0.034030456250524845,
"acc_norm": 0.5919535320068932,
"acc_norm_stderr": 0.03400897171284171,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.4515078409715004,
"mc2_stderr": 0.015012527204943192
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804243,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670726
},
"harness|hellaswag|10": {
"acc": 0.6144194383588927,
"acc_stderr": 0.004857374133246887,
"acc_norm": 0.8208524198366859,
"acc_norm_stderr": 0.0038269212990754017
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934266,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727062,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727062
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.0259060870213193,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.0259060870213193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964684,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964684
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059274,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059274
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02441494730454367,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02441494730454367
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988829,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765407,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765407
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186805,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186805
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.01274520462608314,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.01274520462608314
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.03010563657001663,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.03010563657001663
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6094771241830066,
"acc_stderr": 0.019737008998094597,
"acc_norm": 0.6094771241830066,
"acc_norm_stderr": 0.019737008998094597
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.4515078409715004,
"mc2_stderr": 0.015012527204943192
}
}
```
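The aggregated numbers above are also stored in the `results` configuration declared in the YAML header. A small sketch (again assuming the `datasets` library) that reads them programmatically; the exact schema of the results file may differ from the JSON shown above:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points to the most
# recent run (2023-08-18T14:49:25.189557 for this card).
results = load_dataset(
    "open-llm-leaderboard/details_yeontaek__Platypus2xOpenOrca-13B-LoRa",
    "results",
    split="latest",
)

# Print the first (and typically only) row of aggregated results.
print(results[0])
```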
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa | 2023-08-27T12:41:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2-13B-QLoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2-13B-QLoRa](https://huggingface.co/yeontaek/Platypus2-13B-QLoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T03:06:05.909035](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa/blob/main/results_2023-08-18T03%3A06%3A05.909035.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5733901703784499,\n\
\ \"acc_stderr\": 0.03423406867918981,\n \"acc_norm\": 0.5776904524574422,\n\
\ \"acc_norm_stderr\": 0.03421370471407635,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4337883618834812,\n\
\ \"mc2_stderr\": 0.014530897755514057\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985994,\n\
\ \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.014445698968520769\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6195976897032464,\n\
\ \"acc_stderr\": 0.0048449353275992054,\n \"acc_norm\": 0.8255327623979287,\n\
\ \"acc_norm_stderr\": 0.0037873515193708124\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777474,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777474\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197946,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197946\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899207,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724342,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724342\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n\
\ \"acc_stderr\": 0.014583812465862551,\n \"acc_norm\": 0.789272030651341,\n\
\ \"acc_norm_stderr\": 0.014583812465862551\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510467998,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510467998\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868038,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.03175195237583323,\n\
\ \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.03175195237583323\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.03170056183497309,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.03170056183497309\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4337883618834812,\n\
\ \"mc2_stderr\": 0.014530897755514057\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2-13B-QLoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|arc:challenge|25_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hellaswag|10_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T03:06:05.909035.parquet'
- config_name: results
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- results_2023-08-18T03:06:05.909035.parquet
- split: latest
path:
- results_2023-08-18T03:06:05.909035.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-QLoRa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-QLoRa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-QLoRa](https://huggingface.co/yeontaek/Platypus2-13B-QLoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa",
"harness_truthfulqa_mc_0",
split="train")
```
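The same pattern works for any of the per-task configurations listed in the YAML header of this card; as a minimal sketch (config and split names taken from the header above), the latest abstract-algebra details could be read with:
```python
from datasets import load_dataset

# Load only the most recent details for one MMLU task, using the
# "latest" split that the header defines for every configuration.
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa",
	"harness_hendrycksTest_abstract_algebra_5",
	split="latest")
```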
## Latest results
These are the [latest results from run 2023-08-18T03:06:05.909035](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa/blob/main/results_2023-08-18T03%3A06%3A05.909035.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5733901703784499,
"acc_stderr": 0.03423406867918981,
"acc_norm": 0.5776904524574422,
"acc_norm_stderr": 0.03421370471407635,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4337883618834812,
"mc2_stderr": 0.014530897755514057
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985994,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520769
},
"harness|hellaswag|10": {
"acc": 0.6195976897032464,
"acc_stderr": 0.0048449353275992054,
"acc_norm": 0.8255327623979287,
"acc_norm_stderr": 0.0037873515193708124
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777474,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197946,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197946
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899207,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724342,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724342
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849928,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849928
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862551,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862551
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510467998,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510467998
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868038,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.563265306122449,
"acc_stderr": 0.03175195237583323,
"acc_norm": 0.563265306122449,
"acc_norm_stderr": 0.03175195237583323
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.03170056183497309,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.03170056183497309
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.4337883618834812,
"mc2_stderr": 0.014530897755514057
}
}
```
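These aggregated numbers are also stored in the "results" configuration declared in the YAML header of this card, so they can be reloaded programmatically; a minimal sketch follows (the exact column layout of the results file is not documented here):
```python
from datasets import load_dataset

# The "results" configuration points at results_2023-08-18T03:06:05.909035.parquet;
# its "latest" split always resolves to the most recent run.
results = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa",
	"results",
	split="latest")
print(results)  # inspect the available fields and number of rows
```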
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload | 2023-08-27T12:41:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/trurl-2-13b-pl-instruct_unload](https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T09:28:28.841723](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload/blob/main/results_2023-08-18T09%3A28%3A28.841723.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7795709357261442,\n\
\ \"acc_stderr\": 0.028026888672435916,\n \"acc_norm\": 0.7836129803684724,\n\
\ \"acc_norm_stderr\": 0.028008425887845717,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.01602157061376854,\n \"mc2\": 0.4556260164510823,\n\
\ \"mc2_stderr\": 0.015092321286472773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097664,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6092411870145389,\n\
\ \"acc_stderr\": 0.004869232758103327,\n \"acc_norm\": 0.7999402509460267,\n\
\ \"acc_norm_stderr\": 0.003992272261659569\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7555555555555555,\n\
\ \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.7555555555555555,\n\
\ \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8603773584905661,\n \"acc_stderr\": 0.021331453470148233,\n\
\ \"acc_norm\": 0.8603773584905661,\n \"acc_norm_stderr\": 0.021331453470148233\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.02891980295613489,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.02891980295613489\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.69,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.029957851329869337,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.029957851329869337\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.6274509803921569,\n\
\ \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.6274509803921569,\n\
\ \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7276595744680852,\n\
\ \"acc_stderr\": 0.029101290698386715,\n \"acc_norm\": 0.7276595744680852,\n\
\ \"acc_norm_stderr\": 0.029101290698386715\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04677473004491199,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04677473004491199\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.8068965517241379,\n \"acc_stderr\": 0.03289445522127401,\n \"\
acc_norm\": 0.8068965517241379,\n \"acc_norm_stderr\": 0.03289445522127401\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5767195767195767,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.5767195767195767,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n\
\ \"acc_stderr\": 0.04375888492727059,\n \"acc_norm\": 0.6031746031746031,\n\
\ \"acc_norm_stderr\": 0.04375888492727059\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8580645161290322,\n \"acc_stderr\": 0.019853003676559757,\n \"\
acc_norm\": 0.8580645161290322,\n \"acc_norm_stderr\": 0.019853003676559757\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.7142857142857143,\n \"acc_stderr\": 0.03178529710642749,\n \"\
acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.03178529710642749\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9151515151515152,\n \"acc_stderr\": 0.02175938534083591,\n\
\ \"acc_norm\": 0.9151515151515152,\n \"acc_norm_stderr\": 0.02175938534083591\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199488,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7743589743589744,\n \"acc_stderr\": 0.02119363252514854,\n \
\ \"acc_norm\": 0.7743589743589744,\n \"acc_norm_stderr\": 0.02119363252514854\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.5259259259259259,\n \"acc_stderr\": 0.03044452852881074,\n \
\ \"acc_norm\": 0.5259259259259259,\n \"acc_norm_stderr\": 0.03044452852881074\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673957,\n\
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673957\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5695364238410596,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.5695364238410596,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334884,\n \"\
acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334884\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7037037037037037,\n \"acc_stderr\": 0.031141447823536037,\n \"\
acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.031141447823536037\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9607843137254902,\n \"acc_stderr\": 0.013623692819208815,\n \"\
acc_norm\": 0.9607843137254902,\n \"acc_norm_stderr\": 0.013623692819208815\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8340807174887892,\n\
\ \"acc_stderr\": 0.024967553196547147,\n \"acc_norm\": 0.8340807174887892,\n\
\ \"acc_norm_stderr\": 0.024967553196547147\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n\
\ \"acc_stderr\": 0.0314570385430625,\n \"acc_norm\": 0.8796296296296297,\n\
\ \"acc_norm_stderr\": 0.0314570385430625\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9202453987730062,\n \"acc_stderr\": 0.021284928419899075,\n\
\ \"acc_norm\": 0.9202453987730062,\n \"acc_norm_stderr\": 0.021284928419899075\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n\
\ \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.6607142857142857,\n\
\ \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640404,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640404\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673144,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673144\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n\
\ \"acc_stderr\": 0.010648356301876345,\n \"acc_norm\": 0.9016602809706258,\n\
\ \"acc_norm_stderr\": 0.010648356301876345\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490714,\n\
\ \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490714\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6748603351955307,\n\
\ \"acc_stderr\": 0.015666542785053566,\n \"acc_norm\": 0.6748603351955307,\n\
\ \"acc_norm_stderr\": 0.015666542785053566\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.0227337894054476,\n\
\ \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.0227337894054476\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8488745980707395,\n\
\ \"acc_stderr\": 0.02034274974442863,\n \"acc_norm\": 0.8488745980707395,\n\
\ \"acc_norm_stderr\": 0.02034274974442863\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.01924252622654455,\n\
\ \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.01924252622654455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6631205673758865,\n \"acc_stderr\": 0.028195534873966727,\n \
\ \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.028195534873966727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6857887874837028,\n\
\ \"acc_stderr\": 0.011855911587048224,\n \"acc_norm\": 0.6857887874837028,\n\
\ \"acc_norm_stderr\": 0.011855911587048224\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8713235294117647,\n \"acc_stderr\": 0.020340173153899008,\n\
\ \"acc_norm\": 0.8713235294117647,\n \"acc_norm_stderr\": 0.020340173153899008\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.826797385620915,\n \"acc_stderr\": 0.01530932926696914,\n \
\ \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.01530932926696914\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8909090909090909,\n\
\ \"acc_stderr\": 0.02986054477658413,\n \"acc_norm\": 0.8909090909090909,\n\
\ \"acc_norm_stderr\": 0.02986054477658413\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n\
\ \"acc_stderr\": 0.019675343217199173,\n \"acc_norm\": 0.9154228855721394,\n\
\ \"acc_norm_stderr\": 0.019675343217199173\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594197,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594197\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.8012048192771084,\n\
\ \"acc_stderr\": 0.03106939026078942,\n \"acc_norm\": 0.8012048192771084,\n\
\ \"acc_norm_stderr\": 0.03106939026078942\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9415204678362573,\n \"acc_stderr\": 0.017996678857280124,\n\
\ \"acc_norm\": 0.9415204678362573,\n \"acc_norm_stderr\": 0.017996678857280124\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.01602157061376854,\n \"mc2\": 0.4556260164510823,\n\
\ \"mc2_stderr\": 0.015092321286472773\n }\n}\n```"
repo_url: https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|arc:challenge|25_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hellaswag|10_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T09:28:28.841723.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T09:28:28.841723.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T09:28:28.841723.parquet'
- config_name: results
data_files:
- split: 2023_08_18T09_28_28.841723
path:
- results_2023-08-18T09:28:28.841723.parquet
- split: latest
path:
- results_2023-08-18T09:28:28.841723.parquet
---
# Dataset Card for Evaluation run of Aspik101/trurl-2-13b-pl-instruct_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/trurl-2-13b-pl-instruct_unload](https://huggingface.co/Aspik101/trurl-2-13b-pl-instruct_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload",
"harness_truthfulqa_mc_0",
split="train")
```
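The other configurations listed in this card's YAML metadata can be loaded the same way. As a minimal sketch (the config names `results` and `harness_hendrycksTest_abstract_algebra_5` are taken from the metadata above, and the `latest` split always points to the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics for this model (the "results" configuration defined above).
results = load_dataset(
    "open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload",
    "results",
    split="latest",
)

# Per-sample details for a single MMLU subtask, e.g. abstract algebra (5-shot).
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```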
## Latest results
These are the [latest results from run 2023-08-18T09:28:28.841723](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__trurl-2-13b-pl-instruct_unload/blob/main/results_2023-08-18T09%3A28%3A28.841723.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7795709357261442,
"acc_stderr": 0.028026888672435916,
"acc_norm": 0.7836129803684724,
"acc_norm_stderr": 0.028008425887845717,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.01602157061376854,
"mc2": 0.4556260164510823,
"mc2_stderr": 0.015092321286472773
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097664,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719867
},
"harness|hellaswag|10": {
"acc": 0.6092411870145389,
"acc_stderr": 0.004869232758103327,
"acc_norm": 0.7999402509460267,
"acc_norm_stderr": 0.003992272261659569
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7555555555555555,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.7555555555555555,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8603773584905661,
"acc_stderr": 0.021331453470148233,
"acc_norm": 0.8603773584905661,
"acc_norm_stderr": 0.021331453470148233
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.02891980295613489,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.02891980295613489
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.69,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.029957851329869337,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.029957851329869337
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7276595744680852,
"acc_stderr": 0.029101290698386715,
"acc_norm": 0.7276595744680852,
"acc_norm_stderr": 0.029101290698386715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8068965517241379,
"acc_stderr": 0.03289445522127401,
"acc_norm": 0.8068965517241379,
"acc_norm_stderr": 0.03289445522127401
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5767195767195767,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.5767195767195767,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.04375888492727059,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.04375888492727059
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8580645161290322,
"acc_stderr": 0.019853003676559757,
"acc_norm": 0.8580645161290322,
"acc_norm_stderr": 0.019853003676559757
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.03178529710642749,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.03178529710642749
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9151515151515152,
"acc_stderr": 0.02175938534083591,
"acc_norm": 0.9151515151515152,
"acc_norm_stderr": 0.02175938534083591
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199488,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7743589743589744,
"acc_stderr": 0.02119363252514854,
"acc_norm": 0.7743589743589744,
"acc_norm_stderr": 0.02119363252514854
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.03044452852881074,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.03044452852881074
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673957,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673957
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5695364238410596,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.5695364238410596,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334884,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334884
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.031141447823536037,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.031141447823536037
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9607843137254902,
"acc_stderr": 0.013623692819208815,
"acc_norm": 0.9607843137254902,
"acc_norm_stderr": 0.013623692819208815
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8340807174887892,
"acc_stderr": 0.024967553196547147,
"acc_norm": 0.8340807174887892,
"acc_norm_stderr": 0.024967553196547147
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597453,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597453
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.0314570385430625,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.0314570385430625
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9202453987730062,
"acc_stderr": 0.021284928419899075,
"acc_norm": 0.9202453987730062,
"acc_norm_stderr": 0.021284928419899075
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.044939490686135404,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.044939490686135404
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640404,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640404
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673144,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673144
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876345,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876345
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490714,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490714
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6748603351955307,
"acc_stderr": 0.015666542785053566,
"acc_norm": 0.6748603351955307,
"acc_norm_stderr": 0.015666542785053566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.0227337894054476,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.0227337894054476
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8488745980707395,
"acc_stderr": 0.02034274974442863,
"acc_norm": 0.8488745980707395,
"acc_norm_stderr": 0.02034274974442863
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.01924252622654455,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.01924252622654455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.028195534873966727,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.028195534873966727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6857887874837028,
"acc_stderr": 0.011855911587048224,
"acc_norm": 0.6857887874837028,
"acc_norm_stderr": 0.011855911587048224
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8713235294117647,
"acc_stderr": 0.020340173153899008,
"acc_norm": 0.8713235294117647,
"acc_norm_stderr": 0.020340173153899008
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.01530932926696914,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.01530932926696914
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8909090909090909,
"acc_stderr": 0.02986054477658413,
"acc_norm": 0.8909090909090909,
"acc_norm_stderr": 0.02986054477658413
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199173,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199173
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594197,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594197
},
"harness|hendrycksTest-virology|5": {
"acc": 0.8012048192771084,
"acc_stderr": 0.03106939026078942,
"acc_norm": 0.8012048192771084,
"acc_norm_stderr": 0.03106939026078942
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9415204678362573,
"acc_stderr": 0.017996678857280124,
"acc_norm": 0.9415204678362573,
"acc_norm_stderr": 0.017996678857280124
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.01602157061376854,
"mc2": 0.4556260164510823,
"mc2_stderr": 0.015092321286472773
}
}
```
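For a quick sanity check of the aggregate numbers above, the per-subtask scores can be averaged directly from the JSON. The sketch below assumes the results shown above have been saved locally as `results.json` (a hypothetical filename) and computes an unweighted mean of `acc_norm` over the MMLU (`hendrycksTest`) subtasks; the leaderboard's own aggregation may combine tasks differently.
```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Unweighted mean of acc_norm over the 57 MMLU (hendrycksTest) subtasks.
mmlu_scores = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc_norm = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```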
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud | 2023-09-17T09:10:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jslin09/bloom-560m-finetuned-fraud
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jslin09/bloom-560m-finetuned-fraud](https://huggingface.co/jslin09/bloom-560m-finetuned-fraud)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T09:10:48.065151](https://huggingface.co/datasets/open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud/blob/main/results_2023-09-17T09-10-48.065151.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642965815,\n \"f1\": 0.0032707634228187916,\n\
\ \"f1_stderr\": 0.0005552444547661462,\n \"acc\": 0.24191002367797948,\n\
\ \"acc_stderr\": 0.0070225630654893005\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965815,\n\
\ \"f1\": 0.0032707634228187916,\n \"f1_stderr\": 0.0005552444547661462\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.48382004735595896,\n\
\ \"acc_stderr\": 0.014045126130978601\n }\n}\n```"
repo_url: https://huggingface.co/jslin09/bloom-560m-finetuned-fraud
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T09_10_48.065151
path:
- '**/details_harness|drop|3_2023-09-17T09-10-48.065151.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T09-10-48.065151.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T09_10_48.065151
path:
- '**/details_harness|gsm8k|5_2023-09-17T09-10-48.065151.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T09-10-48.065151.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:20:24.088120.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:20:24.088120.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T09_10_48.065151
path:
- '**/details_harness|winogrande|5_2023-09-17T09-10-48.065151.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T09-10-48.065151.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_20_24.088120
path:
- results_2023-08-17T18:20:24.088120.parquet
- split: 2023_09_17T09_10_48.065151
path:
- results_2023-09-17T09-10-48.065151.parquet
- split: latest
path:
- results_2023-09-17T09-10-48.065151.parquet
---
# Dataset Card for Evaluation run of jslin09/bloom-560m-finetuned-fraud
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jslin09/bloom-560m-finetuned-fraud
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jslin09/bloom-560m-finetuned-fraud](https://huggingface.co/jslin09/bloom-560m-finetuned-fraud) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud",
"harness_winogrande_5",
	split="latest")
```
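To read a different run, use one of the timestamped splits listed in the configuration section above, or use the "results" configuration for the aggregated metrics. The snippet below is a minimal sketch, assuming the configuration and split names shown in this card:
```python
from datasets import load_dataset

# Aggregated metrics ("results" configuration) for the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud",
    "results",
    split="latest",
)

# Per-example details of a specific run, selected by its timestamped split name.
winogrande_run = load_dataset(
    "open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud",
    "harness_winogrande_5",
    split="2023_09_17T09_10_48.065151",
)
```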
## Latest results
These are the [latest results from run 2023-09-17T09:10:48.065151](https://huggingface.co/datasets/open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud/blob/main/results_2023-09-17T09-10-48.065151.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965815,
"f1": 0.0032707634228187916,
"f1_stderr": 0.0005552444547661462,
"acc": 0.24191002367797948,
"acc_stderr": 0.0070225630654893005
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965815,
"f1": 0.0032707634228187916,
"f1_stderr": 0.0005552444547661462
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.48382004735595896,
"acc_stderr": 0.014045126130978601
}
}
```
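The same aggregated metrics are stored as a JSON file in the repository (the link above points at it). Below is a minimal sketch for fetching that file directly with `huggingface_hub`, assuming the filename shown in the link:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for the 2023-09-17 run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jslin09__bloom-560m-finetuned-fraud",
    filename="results_2023-09-17T09-10-48.065151.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

print(results["all"])  # aggregated metrics for this run
```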
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b | 2023-09-17T12:57:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LinkSoul/Chinese-Llama-2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LinkSoul/Chinese-Llama-2-7b](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T12:57:13.908145](https://huggingface.co/datasets/open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b/blob/main/results_2023-09-17T12-57-13.908145.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03271812080536913,\n\
\ \"em_stderr\": 0.0018218405030911197,\n \"f1\": 0.0883210989932887,\n\
\ \"f1_stderr\": 0.002162976048256438,\n \"acc\": 0.43625495385576474,\n\
\ \"acc_stderr\": 0.011101966395253314\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03271812080536913,\n \"em_stderr\": 0.0018218405030911197,\n\
\ \"f1\": 0.0883210989932887,\n \"f1_stderr\": 0.002162976048256438\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14480667172100076,\n \
\ \"acc_stderr\": 0.009693234799052694\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453934\n\
\ }\n}\n```"
repo_url: https://huggingface.co/LinkSoul/Chinese-Llama-2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T12_57_13.908145
path:
- '**/details_harness|drop|3_2023-09-17T12-57-13.908145.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T12-57-13.908145.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T12_57_13.908145
path:
- '**/details_harness|gsm8k|5_2023-09-17T12-57-13.908145.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T12-57-13.908145.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:27:31.562743.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:27:31.562743.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T12_57_13.908145
path:
- '**/details_harness|winogrande|5_2023-09-17T12-57-13.908145.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T12-57-13.908145.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_27_31.562743
path:
- results_2023-08-17T18:27:31.562743.parquet
- split: 2023_09_17T12_57_13.908145
path:
- results_2023-09-17T12-57-13.908145.parquet
- split: latest
path:
- results_2023-09-17T12-57-13.908145.parquet
---
# Dataset Card for Evaluation run of LinkSoul/Chinese-Llama-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LinkSoul/Chinese-Llama-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LinkSoul/Chinese-Llama-2-7b](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b",
"harness_winogrande_5",
	split="latest")
```
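For interactive inspection, a loaded split can be converted to a pandas DataFrame. This is a minimal sketch, assuming `pandas` is installed alongside `datasets` and using the configuration names listed above:
```python
from datasets import load_dataset

# Per-example details for the latest GSM8K run of this model.
details = load_dataset(
    "open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b",
    "harness_gsm8k_5",
    split="latest",
)

df = details.to_pandas()       # one row per evaluated example
print(df.columns.tolist())     # fields logged by the evaluation harness
print(len(df), "examples")
```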
## Latest results
These are the [latest results from run 2023-09-17T12:57:13.908145](https://huggingface.co/datasets/open-llm-leaderboard/details_LinkSoul__Chinese-Llama-2-7b/blob/main/results_2023-09-17T12-57-13.908145.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03271812080536913,
"em_stderr": 0.0018218405030911197,
"f1": 0.0883210989932887,
"f1_stderr": 0.002162976048256438,
"acc": 0.43625495385576474,
"acc_stderr": 0.011101966395253314
},
"harness|drop|3": {
"em": 0.03271812080536913,
"em_stderr": 0.0018218405030911197,
"f1": 0.0883210989932887,
"f1_stderr": 0.002162976048256438
},
"harness|gsm8k|5": {
"acc": 0.14480667172100076,
"acc_stderr": 0.009693234799052694
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453934
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2 | 2023-08-27T12:41:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FelixChao/llama2-13b-math1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/llama2-13b-math1.2](https://huggingface.co/FelixChao/llama2-13b-math1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T11:24:31.239858](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2/blob/main/results_2023-08-18T11%3A24%3A31.239858.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5316652022369519,\n\
\ \"acc_stderr\": 0.03464809198436301,\n \"acc_norm\": 0.5358552773178059,\n\
\ \"acc_norm_stderr\": 0.034630143612028014,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4829988104244731,\n\
\ \"mc2_stderr\": 0.01529285018066333\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5204778156996587,\n \"acc_stderr\": 0.014599131353035009,\n\
\ \"acc_norm\": 0.5708191126279863,\n \"acc_norm_stderr\": 0.014464085894870653\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6092411870145389,\n\
\ \"acc_stderr\": 0.004869232758103326,\n \"acc_norm\": 0.8061143198566023,\n\
\ \"acc_norm_stderr\": 0.003945324248503059\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n\
\ \"acc_stderr\": 0.02762171783290703,\n \"acc_norm\": 0.6193548387096774,\n\
\ \"acc_norm_stderr\": 0.02762171783290703\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.034767257476490364,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.034767257476490364\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.032752644677915166,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.032752644677915166\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46923076923076923,\n \"acc_stderr\": 0.02530295889085015,\n\
\ \"acc_norm\": 0.46923076923076923,\n \"acc_norm_stderr\": 0.02530295889085015\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7027522935779816,\n \"acc_stderr\": 0.01959570722464353,\n \"\
acc_norm\": 0.7027522935779816,\n \"acc_norm_stderr\": 0.01959570722464353\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\"\
: 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n\
\ \"acc_stderr\": 0.030190282453501947,\n \"acc_norm\": 0.7549019607843137,\n\
\ \"acc_norm_stderr\": 0.030190282453501947\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658335,\n\
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392905,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392905\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7254150702426565,\n\
\ \"acc_stderr\": 0.015959829933084025,\n \"acc_norm\": 0.7254150702426565,\n\
\ \"acc_norm_stderr\": 0.015959829933084025\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.02622615860512466,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.02622615860512466\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.311731843575419,\n\
\ \"acc_stderr\": 0.015491756531894638,\n \"acc_norm\": 0.311731843575419,\n\
\ \"acc_norm_stderr\": 0.015491756531894638\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124764,\n\
\ \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124764\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480617,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480617\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n\
\ \"acc_stderr\": 0.012337391684530314,\n \"acc_norm\": 0.3709256844850065,\n\
\ \"acc_norm_stderr\": 0.012337391684530314\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.03030625772246832,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.03030625772246832\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5245098039215687,\n \"acc_stderr\": 0.020203517280261436,\n \
\ \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.020203517280261436\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287247,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287247\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4829988104244731,\n\
\ \"mc2_stderr\": 0.01529285018066333\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/llama2-13b-math1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|arc:challenge|25_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hellaswag|10_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:24:31.239858.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T11:24:31.239858.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T11:24:31.239858.parquet'
- config_name: results
data_files:
- split: 2023_08_18T11_24_31.239858
path:
- results_2023-08-18T11:24:31.239858.parquet
- split: latest
path:
- results_2023-08-18T11:24:31.239858.parquet
---
# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FelixChao/llama2-13b-math1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FelixChao/llama2-13b-math1.2](https://huggingface.co/FelixChao/llama2-13b-math1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2",
"harness_truthfulqa_mc_0",
split="train")
```
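Beyond the example above, each configuration also defines an explicit `latest` split, and the aggregated metrics live in the `results` configuration (see the `configs` section of this card). A minimal sketch, assuming the standard `datasets` API and the configuration names listed above:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2"

# Per-task details: pick any config name from the `configs` list above and
# request its "latest" split to get the most recent evaluation run.
arc_details = load_dataset(REPO, "harness_arc_challenge_25", split="latest")

# Aggregated metrics for the whole run are stored in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```
The splits named after a timestamp (e.g. `2023_08_18T11_24_31.239858`) return the same data pinned to a specific run.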
## Latest results
These are the [latest results from run 2023-08-18T11:24:31.239858](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2/blob/main/results_2023-08-18T11%3A24%3A31.239858.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5316652022369519,
"acc_stderr": 0.03464809198436301,
"acc_norm": 0.5358552773178059,
"acc_norm_stderr": 0.034630143612028014,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4829988104244731,
"mc2_stderr": 0.01529285018066333
},
"harness|arc:challenge|25": {
"acc": 0.5204778156996587,
"acc_stderr": 0.014599131353035009,
"acc_norm": 0.5708191126279863,
"acc_norm_stderr": 0.014464085894870653
},
"harness|hellaswag|10": {
"acc": 0.6092411870145389,
"acc_stderr": 0.004869232758103326,
"acc_norm": 0.8061143198566023,
"acc_norm_stderr": 0.003945324248503059
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.02762171783290703,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.02762171783290703
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.034767257476490364,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.034767257476490364
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.032752644677915166,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.032752644677915166
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46923076923076923,
"acc_stderr": 0.02530295889085015,
"acc_norm": 0.46923076923076923,
"acc_norm_stderr": 0.02530295889085015
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7027522935779816,
"acc_stderr": 0.01959570722464353,
"acc_norm": 0.7027522935779816,
"acc_norm_stderr": 0.01959570722464353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392905,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392905
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7254150702426565,
"acc_stderr": 0.015959829933084025,
"acc_norm": 0.7254150702426565,
"acc_norm_stderr": 0.015959829933084025
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.02622615860512466,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.02622615860512466
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.311731843575419,
"acc_stderr": 0.015491756531894638,
"acc_norm": 0.311731843575419,
"acc_norm_stderr": 0.015491756531894638
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5895061728395061,
"acc_stderr": 0.027371350925124764,
"acc_norm": 0.5895061728395061,
"acc_norm_stderr": 0.027371350925124764
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480617,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480617
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.012337391684530314,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.012337391684530314
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.03030625772246832,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.03030625772246832
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.020203517280261436,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.020203517280261436
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287247,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287247
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4829988104244731,
"mc2_stderr": 0.01529285018066333
}
}
```
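If you want the raw results file rather than the parquet-backed dataset, it can be downloaded directly from the repository. A minimal sketch, assuming the `huggingface_hub` client and the results filename linked at the top of this section (the on-disk JSON may wrap these metrics in additional top-level keys):
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_FelixChao__llama2-13b-math1.2",
    filename="results_2023-08-18T11:24:31.239858.json",
    repo_type="dataset",
)

with open(path) as f:
    raw_results = json.load(f)

# Inspect the top-level keys; the excerpt above shows the per-task metrics.
print(sorted(raw_results))
```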
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FelixChao__vicuna-7B-physics | 2023-08-27T12:41:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FelixChao/vicuna-7B-physics
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/vicuna-7B-physics](https://huggingface.co/FelixChao/vicuna-7B-physics)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__vicuna-7B-physics\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T10:17:03.743373](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__vicuna-7B-physics/blob/main/results_2023-08-18T10%3A17%3A03.743373.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4673159708306307,\n\
\ \"acc_stderr\": 0.03524013505275528,\n \"acc_norm\": 0.4712606198259356,\n\
\ \"acc_norm_stderr\": 0.03523003310917091,\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920612,\n \"mc2\": 0.493053362344385,\n\
\ \"mc2_stderr\": 0.015349106505424853\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4496587030716723,\n \"acc_stderr\": 0.014537144444284732,\n\
\ \"acc_norm\": 0.4948805460750853,\n \"acc_norm_stderr\": 0.01461062489030916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5713005377414858,\n\
\ \"acc_stderr\": 0.004938787067611806,\n \"acc_norm\": 0.7588129854610636,\n\
\ \"acc_norm_stderr\": 0.004269291950109924\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490437,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490437\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669416,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669416\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4967741935483871,\n \"acc_stderr\": 0.02844341422643831,\n \"\
acc_norm\": 0.4967741935483871,\n \"acc_norm_stderr\": 0.02844341422643831\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.034889016168527305,\n \"\
acc_norm\": 0.601010101010101,\n \"acc_norm_stderr\": 0.034889016168527305\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6373056994818653,\n \"acc_stderr\": 0.03469713791704372,\n\
\ \"acc_norm\": 0.6373056994818653,\n \"acc_norm_stderr\": 0.03469713791704372\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.0252544854247996,\n \
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.0252544854247996\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150016,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150016\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5761467889908257,\n \"acc_stderr\": 0.02118726320908754,\n \"\
acc_norm\": 0.5761467889908257,\n \"acc_norm_stderr\": 0.02118726320908754\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.03338473403207401,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.03338473403207401\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.553921568627451,\n \"acc_stderr\": 0.034888454513049734,\n \"\
acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.034888454513049734\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5991561181434599,\n \"acc_stderr\": 0.03190080389473235,\n \
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.03190080389473235\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787275,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787275\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.04777615181156739,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.04777615181156739\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n\
\ \"acc_stderr\": 0.030118210106942635,\n \"acc_norm\": 0.6965811965811965,\n\
\ \"acc_norm_stderr\": 0.030118210106942635\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.017268607560005787,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.017268607560005787\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.026907849856282542,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.026907849856282542\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n\
\ \"acc_stderr\": 0.02836504154256457,\n \"acc_norm\": 0.5241157556270096,\n\
\ \"acc_norm_stderr\": 0.02836504154256457\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323667,\n\
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323667\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34159061277705344,\n\
\ \"acc_stderr\": 0.012112391320842849,\n \"acc_norm\": 0.34159061277705344,\n\
\ \"acc_norm_stderr\": 0.012112391320842849\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42320261437908496,\n \"acc_stderr\": 0.01998780976948206,\n \
\ \"acc_norm\": 0.42320261437908496,\n \"acc_norm_stderr\": 0.01998780976948206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6081871345029239,\n \"acc_stderr\": 0.037439798259263996,\n\
\ \"acc_norm\": 0.6081871345029239,\n \"acc_norm_stderr\": 0.037439798259263996\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920612,\n \"mc2\": 0.493053362344385,\n\
\ \"mc2_stderr\": 0.015349106505424853\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/vicuna-7B-physics
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|arc:challenge|25_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hellaswag|10_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T10:17:03.743373.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T10:17:03.743373.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T10:17:03.743373.parquet'
- config_name: results
data_files:
- split: 2023_08_18T10_17_03.743373
path:
- results_2023-08-18T10:17:03.743373.parquet
- split: latest
path:
- results_2023-08-18T10:17:03.743373.parquet
---
# Dataset Card for Evaluation run of FelixChao/vicuna-7B-physics
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FelixChao/vicuna-7B-physics
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FelixChao/vicuna-7B-physics](https://huggingface.co/FelixChao/vicuna-7B-physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__vicuna-7B-physics",
"harness_truthfulqa_mc_0",
split="train")
```
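As a complementary sketch (using only configuration and split names declared in this card), the aggregated metrics and the details of a single MMLU subtask can be loaded in the same way; the `latest` split resolves to the most recent run:
```python
from datasets import load_dataset

# Aggregated metrics for this model ("results" configuration);
# the "latest" split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__vicuna-7B-physics",
    "results",
    split="latest",
)

# Per-sample details for one MMLU subtask, also at its latest run.
astronomy = load_dataset(
    "open-llm-leaderboard/details_FelixChao__vicuna-7B-physics",
    "harness_hendrycksTest_astronomy_5",
    split="latest",
)
```
Any other configuration listed under `configs` above can be substituted for the subtask name.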
## Latest results
These are the [latest results from run 2023-08-18T10:17:03.743373](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__vicuna-7B-physics/blob/main/results_2023-08-18T10%3A17%3A03.743373.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4673159708306307,
"acc_stderr": 0.03524013505275528,
"acc_norm": 0.4712606198259356,
"acc_norm_stderr": 0.03523003310917091,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920612,
"mc2": 0.493053362344385,
"mc2_stderr": 0.015349106505424853
},
"harness|arc:challenge|25": {
"acc": 0.4496587030716723,
"acc_stderr": 0.014537144444284732,
"acc_norm": 0.4948805460750853,
"acc_norm_stderr": 0.01461062489030916
},
"harness|hellaswag|10": {
"acc": 0.5713005377414858,
"acc_stderr": 0.004938787067611806,
"acc_norm": 0.7588129854610636,
"acc_norm_stderr": 0.004269291950109924
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669416,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669416
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.041349130183033156,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.041349130183033156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4967741935483871,
"acc_stderr": 0.02844341422643831,
"acc_norm": 0.4967741935483871,
"acc_norm_stderr": 0.02844341422643831
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.034889016168527305,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.034889016168527305
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6373056994818653,
"acc_stderr": 0.03469713791704372,
"acc_norm": 0.6373056994818653,
"acc_norm_stderr": 0.03469713791704372
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.0252544854247996,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.0252544854247996
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5761467889908257,
"acc_stderr": 0.02118726320908754,
"acc_norm": 0.5761467889908257,
"acc_norm_stderr": 0.02118726320908754
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.034888454513049734,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.034888454513049734
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.03190080389473235,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.03190080389473235
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319772,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319772
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787275,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787275
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.04777615181156739,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.04777615181156739
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.030118210106942635,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.030118210106942635
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.017268607560005787,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.017268607560005787
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.02836504154256457,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.02836504154256457
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.027801656212323667,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.027801656212323667
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34159061277705344,
"acc_stderr": 0.012112391320842849,
"acc_norm": 0.34159061277705344,
"acc_norm_stderr": 0.012112391320842849
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42320261437908496,
"acc_stderr": 0.01998780976948206,
"acc_norm": 0.42320261437908496,
"acc_norm_stderr": 0.01998780976948206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6081871345029239,
"acc_stderr": 0.037439798259263996,
"acc_norm": 0.6081871345029239,
"acc_norm_stderr": 0.037439798259263996
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920612,
"mc2": 0.493053362344385,
"mc2_stderr": 0.015349106505424853
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1 | 2023-09-18T01:52:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FelixChao/llama2-13b-math1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/llama2-13b-math1.1](https://huggingface.co/FelixChao/llama2-13b-math1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T01:52:04.935110](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1/blob/main/results_2023-09-18T01-52-04.935110.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.059668624161073824,\n\
\ \"em_stderr\": 0.002425789584380313,\n \"f1\": 0.13621015100671116,\n\
\ \"f1_stderr\": 0.002790966839907262,\n \"acc\": 0.42716702579565374,\n\
\ \"acc_stderr\": 0.01036106550745729\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.059668624161073824,\n \"em_stderr\": 0.002425789584380313,\n\
\ \"f1\": 0.13621015100671116,\n \"f1_stderr\": 0.002790966839907262\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \
\ \"acc_stderr\": 0.00851098256552048\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394103\n\
\ }\n}\n```"
repo_url: https://huggingface.co/FelixChao/llama2-13b-math1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|arc:challenge|25_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T01_52_04.935110
path:
- '**/details_harness|drop|3_2023-09-18T01-52-04.935110.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T01-52-04.935110.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T01_52_04.935110
path:
- '**/details_harness|gsm8k|5_2023-09-18T01-52-04.935110.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T01-52-04.935110.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hellaswag|10_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T11:29:01.098404.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T11:29:01.098404.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T01_52_04.935110
path:
- '**/details_harness|winogrande|5_2023-09-18T01-52-04.935110.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T01-52-04.935110.parquet'
- config_name: results
data_files:
- split: 2023_08_18T11_29_01.098404
path:
- results_2023-08-18T11:29:01.098404.parquet
- split: 2023_09_18T01_52_04.935110
path:
- results_2023-09-18T01-52-04.935110.parquet
- split: latest
path:
- results_2023-09-18T01-52-04.935110.parquet
---
# Dataset Card for Evaluation run of FelixChao/llama2-13b-math1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FelixChao/llama2-13b-math1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FelixChao/llama2-13b-math1.1](https://huggingface.co/FelixChao/llama2-13b-math1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1",
"harness_winogrande_5",
split="train")
```
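The same call works with any of the configuration and split names listed in the YAML header above. As a minimal sketch (assuming those names are available for this repository), you can also load the aggregated "results" configuration, or pin a task to a specific run via its timestamped split:
```python
from datasets import load_dataset

# Aggregated metrics; the "latest" split maps to the most recent run's results file.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1",
    "results",
    split="latest",
)

# Per-example details for one task, pinned to a specific run by its timestamped split.
winogrande = load_dataset(
    "open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1",
    "harness_winogrande_5",
    split="2023_09_18T01_52_04.935110",
)

print(results[0])     # aggregated scores for the run
print(winogrande[0])  # first row of the per-example details
```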
## Latest results
These are the [latest results from run 2023-09-18T01:52:04.935110](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__llama2-13b-math1.1/blob/main/results_2023-09-18T01-52-04.935110.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.059668624161073824,
"em_stderr": 0.002425789584380313,
"f1": 0.13621015100671116,
"f1_stderr": 0.002790966839907262,
"acc": 0.42716702579565374,
"acc_stderr": 0.01036106550745729
},
"harness|drop|3": {
"em": 0.059668624161073824,
"em_stderr": 0.002425789584380313,
"f1": 0.13621015100671116,
"f1_stderr": 0.002790966839907262
},
"harness|gsm8k|5": {
"acc": 0.1068991660348749,
"acc_stderr": 0.00851098256552048
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394103
}
}
```
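For reference, the top-level "all" block appears to be a simple per-task average; a quick, hedged sanity check against the numbers above:
```python
# The "all" accuracy looks like the mean of the gsm8k and winogrande accuracies.
gsm8k_acc = 0.1068991660348749
winogrande_acc = 0.7474348855564326
print((gsm8k_acc + winogrande_acc) / 2)  # ~0.42716702579565374, matching "all" -> "acc"
```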
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3 | 2023-09-23T08:56:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nkpz/llama2-22b-daydreamer-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nkpz/llama2-22b-daydreamer-v3](https://huggingface.co/nkpz/llama2-22b-daydreamer-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T08:56:42.787237](https://huggingface.co/datasets/open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3/blob/main/results_2023-09-23T08-56-42.787237.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006606543624161074,\n\
\ \"em_stderr\": 0.0008296357389921868,\n \"f1\": 0.08847525167785215,\n\
\ \"f1_stderr\": 0.0017746482079898484,\n \"acc\": 0.38635706776019,\n\
\ \"acc_stderr\": 0.008833441686995644\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006606543624161074,\n \"em_stderr\": 0.0008296357389921868,\n\
\ \"f1\": 0.08847525167785215,\n \"f1_stderr\": 0.0017746482079898484\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03790750568612585,\n \
\ \"acc_stderr\": 0.0052603339077984266\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.012406549466192861\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nkpz/llama2-22b-daydreamer-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T08_56_42.787237
path:
- '**/details_harness|drop|3_2023-09-23T08-56-42.787237.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T08-56-42.787237.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T08_56_42.787237
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-56-42.787237.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-56-42.787237.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:34:13.922429.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:34:13.922429.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T08_56_42.787237
path:
- '**/details_harness|winogrande|5_2023-09-23T08-56-42.787237.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T08-56-42.787237.parquet'
- config_name: results
data_files:
- split: 2023_08_17T14_34_13.922429
path:
- results_2023-08-17T14:34:13.922429.parquet
- split: 2023_09_23T08_56_42.787237
path:
- results_2023-09-23T08-56-42.787237.parquet
- split: latest
path:
- results_2023-09-23T08-56-42.787237.parquet
---
# Dataset Card for Evaluation run of nkpz/llama2-22b-daydreamer-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nkpz/llama2-22b-daydreamer-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nkpz/llama2-22b-daydreamer-v3](https://huggingface.co/nkpz/llama2-22b-daydreamer-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
"harness_winogrande_5",
split="train")
```
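Each configuration also exposes a `latest` split alongside the timestamped ones (see the `configs` section above), so you can pin a single task to its most recent run. A minimal sketch, assuming the same `datasets` setup as in the snippet above:
```python
from datasets import load_dataset

# "harness_winogrande_5" and the "latest" split are both declared in the configs
# section of this card; "latest" resolves to the most recent run's parquet files.
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
    "harness_winogrande_5",
    split="latest",
)
print(winogrande_details)
```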
## Latest results
These are the [latest results from run 2023-09-23T08:56:42.787237](https://huggingface.co/datasets/open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3/blob/main/results_2023-09-23T08-56-42.787237.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006606543624161074,
"em_stderr": 0.0008296357389921868,
"f1": 0.08847525167785215,
"f1_stderr": 0.0017746482079898484,
"acc": 0.38635706776019,
"acc_stderr": 0.008833441686995644
},
"harness|drop|3": {
"em": 0.006606543624161074,
"em_stderr": 0.0008296357389921868,
"f1": 0.08847525167785215,
"f1_stderr": 0.0017746482079898484
},
"harness|gsm8k|5": {
"acc": 0.03790750568612585,
"acc_stderr": 0.0052603339077984266
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.012406549466192861
}
}
```
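To fetch these aggregated numbers programmatically instead of reading the JSON above, you can load the `results` configuration declared in this card's configs section. A minimal sketch; the exact column layout of the results parquet file depends on the harness version, so inspect the schema after loading:
```python
from datasets import load_dataset

# The "results" config aggregates every run; its "latest" split points to the
# most recent results parquet file (both are declared in the configs above).
aggregated = load_dataset(
    "open-llm-leaderboard/details_nkpz__llama2-22b-daydreamer-v3",
    "results",
    split="latest",
)
print(aggregated)  # check the features before relying on a specific metric column
```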
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Voicelab__trurl-2-13b | 2023-08-28T20:57:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Voicelab/trurl-2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Voicelab/trurl-2-13b](https://huggingface.co/Voicelab/trurl-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 119 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Voicelab__trurl-2-13b\"\
,\n\t\"original_mmlu_world_religions_5\",\n\tsplit=\"train\")\n```\n\n## Latest\
\ results\n\nThese are the [latest results from run 2023-08-28T20:57:35.828044](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b/blob/main/results_2023-08-28T20%3A57%3A35.828044.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7857118464693361,\n\
\ \"acc_stderr\": 0.028708315182495378\n },\n \"original|mmlu:abstract_algebra|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919\n },\n\
\ \"original|mmlu:anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \
\ \"acc_stderr\": 0.03749850709174022\n },\n \"original|mmlu:astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123398\n\
\ },\n \"original|mmlu:business_ethics|5\": {\n \"acc\": 0.77,\n \
\ \"acc_stderr\": 0.04229525846816506\n },\n \"original|mmlu:clinical_knowledge|5\"\
: {\n \"acc\": 0.8641509433962264,\n \"acc_stderr\": 0.02108730862243985\n\
\ },\n \"original|mmlu:college_biology|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.02891980295613489\n },\n \"original|mmlu:college_chemistry|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632\n },\n\
\ \"original|mmlu:college_computer_science|5\": {\n \"acc\": 0.68,\n \
\ \"acc_stderr\": 0.046882617226215034\n },\n \"original|mmlu:college_mathematics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05\n },\n \"original|mmlu:college_medicine|5\"\
: {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.030299574664788147\n\
\ },\n \"original|mmlu:college_physics|5\": {\n \"acc\": 0.6176470588235294,\n\
\ \"acc_stderr\": 0.04835503696107223\n },\n \"original|mmlu:computer_security|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704\n },\n\
\ \"original|mmlu:conceptual_physics|5\": {\n \"acc\": 0.7319148936170212,\n\
\ \"acc_stderr\": 0.028957342788342347\n },\n \"original|mmlu:econometrics|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04677473004491199\n\
\ },\n \"original|mmlu:electrical_engineering|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.03333333333333329\n },\n \"original|mmlu:elementary_mathematics|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.02548718714785938\n\
\ },\n \"original|mmlu:formal_logic|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04360314860077459\n },\n \"original|mmlu:global_facts|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05\n },\n \"original|mmlu:high_school_biology|5\"\
: {\n \"acc\": 0.8548387096774194,\n \"acc_stderr\": 0.02003956362805329\n\
\ },\n \"original|mmlu:high_school_chemistry|5\": {\n \"acc\": 0.7142857142857143,\n\
\ \"acc_stderr\": 0.03178529710642749\n },\n \"original|mmlu:high_school_computer_science|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446\n },\n\
\ \"original|mmlu:high_school_european_history|5\": {\n \"acc\": 0.9151515151515152,\n\
\ \"acc_stderr\": 0.02175938534083591\n },\n \"original|mmlu:high_school_geography|5\"\
: {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536\n\
\ },\n \"original|mmlu:high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528\n\
\ },\n \"original|mmlu:high_school_macroeconomics|5\": {\n \"acc\"\
: 0.782051282051282,\n \"acc_stderr\": 0.020932445774463196\n },\n \
\ \"original|mmlu:high_school_mathematics|5\": {\n \"acc\": 0.5148148148148148,\n\
\ \"acc_stderr\": 0.030472153249328598\n },\n \"original|mmlu:high_school_microeconomics|5\"\
: {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.022730208119306542\n\
\ },\n \"original|mmlu:high_school_physics|5\": {\n \"acc\": 0.5761589403973509,\n\
\ \"acc_stderr\": 0.04034846678603395\n },\n \"original|mmlu:high_school_psychology|5\"\
: {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116253\n\
\ },\n \"original|mmlu:high_school_statistics|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03099866630456053\n },\n \"original|mmlu:high_school_us_history|5\"\
: {\n \"acc\": 0.9558823529411765,\n \"acc_stderr\": 0.014413198705704807\n\
\ },\n \"original|mmlu:high_school_world_history|5\": {\n \"acc\":\
\ 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671\n },\n \
\ \"original|mmlu:human_aging|5\": {\n \"acc\": 0.8430493273542601,\n \
\ \"acc_stderr\": 0.024413587174907395\n },\n \"original|mmlu:human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453\n\
\ },\n \"original|mmlu:international_law|5\": {\n \"acc\": 0.9008264462809917,\n\
\ \"acc_stderr\": 0.02728524631275896\n },\n \"original|mmlu:jurisprudence|5\"\
: {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.0314570385430625\n\
\ },\n \"original|mmlu:logical_fallacies|5\": {\n \"acc\": 0.9141104294478528,\n\
\ \"acc_stderr\": 0.022014662933817535\n },\n \"original|mmlu:machine_learning|5\"\
: {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.04547960999764376\n\
\ },\n \"original|mmlu:management|5\": {\n \"acc\": 0.8932038834951457,\n\
\ \"acc_stderr\": 0.030581088928331366\n },\n \"original|mmlu:marketing|5\"\
: {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874\n\
\ },\n \"original|mmlu:medical_genetics|5\": {\n \"acc\": 0.84,\n \
\ \"acc_stderr\": 0.03684529491774709\n },\n \"original|mmlu:miscellaneous|5\"\
: {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.010524031079055834\n\
\ },\n \"original|mmlu:moral_disputes|5\": {\n \"acc\": 0.8352601156069365,\n\
\ \"acc_stderr\": 0.019971040982442272\n },\n \"original|mmlu:moral_scenarios|5\"\
: {\n \"acc\": 0.6715083798882682,\n \"acc_stderr\": 0.015707935398496457\n\
\ },\n \"original|mmlu:nutrition|5\": {\n \"acc\": 0.8104575163398693,\n\
\ \"acc_stderr\": 0.022442358263336213\n },\n \"original|mmlu:philosophy|5\"\
: {\n \"acc\": 0.8520900321543409,\n \"acc_stderr\": 0.020163253806284108\n\
\ },\n \"original|mmlu:prehistory|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.01924252622654455\n },\n \"original|mmlu:professional_accounting|5\"\
: {\n \"acc\": 0.6524822695035462,\n \"acc_stderr\": 0.028406627809590954\n\
\ },\n \"original|mmlu:professional_law|5\": {\n \"acc\": 0.6844850065189049,\n\
\ \"acc_stderr\": 0.01186918484305864\n },\n \"original|mmlu:professional_medicine|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.020089743302935947\n \
\ },\n \"original|mmlu:professional_psychology|5\": {\n \"acc\": 0.8300653594771242,\n\
\ \"acc_stderr\": 0.01519415311318472\n },\n \"original|mmlu:public_relations|5\"\
: {\n \"acc\": 0.8818181818181818,\n \"acc_stderr\": 0.030920863185231417\n\
\ },\n \"original|mmlu:security_studies|5\": {\n \"acc\": 0.8,\n \
\ \"acc_stderr\": 0.025607375986579157\n },\n \"original|mmlu:sociology|5\"\
: {\n \"acc\": 0.9104477611940298,\n \"acc_stderr\": 0.02019067053502793\n\
\ },\n \"original|mmlu:us_foreign_policy|5\": {\n \"acc\": 0.95,\n\
\ \"acc_stderr\": 0.021904291355759033\n },\n \"original|mmlu:virology|5\"\
: {\n \"acc\": 0.7951807228915663,\n \"acc_stderr\": 0.03141784291663925\n\
\ },\n \"original|mmlu:world_religions|5\": {\n \"acc\": 0.9415204678362573,\n\
\ \"acc_stderr\": 0.017996678857280124\n }\n}\n```"
repo_url: https://huggingface.co/Voicelab/trurl-2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:17:14.973994.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:17:14.973994.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:17:14.973994.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:57:35.828044.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_57_35.828044
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:57:35.828044.parquet'
- config_name: results
data_files:
- split: 2023_08_17T15_17_14.973994
path:
- results_2023-08-17T15:17:14.973994.parquet
- split: 2023_08_28T20_57_35.828044
path:
- results_2023-08-28T20:57:35.828044.parquet
- split: latest
path:
- results_2023-08-28T20:57:35.828044.parquet
---
# Dataset Card for Evaluation run of Voicelab/trurl-2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Voicelab/trurl-2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-13b](https://huggingface.co/Voicelab/trurl-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 119 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-13b",
"original_mmlu_world_religions_5",
split="train")
```
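Since the `latest` split of every configuration points to the most recent timestamped run, you can also grab the aggregated scores directly from the `results` configuration. This is a minimal sketch based on the configurations declared in the YAML header above; the exact row layout of the parquet file may differ from the printed results shown below.
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run; "latest" resolves to the newest timestamped split
results = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-13b",
	"results",
	split="latest")

print(results[0])  # inspect the first row of aggregated results
```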
## Latest results
These are the [latest results from run 2023-08-28T20:57:35.828044](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b/blob/main/results_2023-08-28T20%3A57%3A35.828044.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7857118464693361,
"acc_stderr": 0.028708315182495378
},
"original|mmlu:abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919
},
"original|mmlu:anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174022
},
"original|mmlu:astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123398
},
"original|mmlu:business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506
},
"original|mmlu:clinical_knowledge|5": {
"acc": 0.8641509433962264,
"acc_stderr": 0.02108730862243985
},
"original|mmlu:college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.02891980295613489
},
"original|mmlu:college_chemistry|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632
},
"original|mmlu:college_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034
},
"original|mmlu:college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.05
},
"original|mmlu:college_medicine|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.030299574664788147
},
"original|mmlu:college_physics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.04835503696107223
},
"original|mmlu:computer_security|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704
},
"original|mmlu:conceptual_physics|5": {
"acc": 0.7319148936170212,
"acc_stderr": 0.028957342788342347
},
"original|mmlu:econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199
},
"original|mmlu:electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.03333333333333329
},
"original|mmlu:elementary_mathematics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.02548718714785938
},
"original|mmlu:formal_logic|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04360314860077459
},
"original|mmlu:global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05
},
"original|mmlu:high_school_biology|5": {
"acc": 0.8548387096774194,
"acc_stderr": 0.02003956362805329
},
"original|mmlu:high_school_chemistry|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.03178529710642749
},
"original|mmlu:high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446
},
"original|mmlu:high_school_european_history|5": {
"acc": 0.9151515151515152,
"acc_stderr": 0.02175938534083591
},
"original|mmlu:high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536
},
"original|mmlu:high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528
},
"original|mmlu:high_school_macroeconomics|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.020932445774463196
},
"original|mmlu:high_school_mathematics|5": {
"acc": 0.5148148148148148,
"acc_stderr": 0.030472153249328598
},
"original|mmlu:high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.022730208119306542
},
"original|mmlu:high_school_physics|5": {
"acc": 0.5761589403973509,
"acc_stderr": 0.04034846678603395
},
"original|mmlu:high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116253
},
"original|mmlu:high_school_statistics|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03099866630456053
},
"original|mmlu:high_school_us_history|5": {
"acc": 0.9558823529411765,
"acc_stderr": 0.014413198705704807
},
"original|mmlu:high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671
},
"original|mmlu:human_aging|5": {
"acc": 0.8430493273542601,
"acc_stderr": 0.024413587174907395
},
"original|mmlu:human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597453
},
"original|mmlu:international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896
},
"original|mmlu:jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.0314570385430625
},
"original|mmlu:logical_fallacies|5": {
"acc": 0.9141104294478528,
"acc_stderr": 0.022014662933817535
},
"original|mmlu:machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04547960999764376
},
"original|mmlu:management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366
},
"original|mmlu:marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874
},
"original|mmlu:medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709
},
"original|mmlu:miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.010524031079055834
},
"original|mmlu:moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442272
},
"original|mmlu:moral_scenarios|5": {
"acc": 0.6715083798882682,
"acc_stderr": 0.015707935398496457
},
"original|mmlu:nutrition|5": {
"acc": 0.8104575163398693,
"acc_stderr": 0.022442358263336213
},
"original|mmlu:philosophy|5": {
"acc": 0.8520900321543409,
"acc_stderr": 0.020163253806284108
},
"original|mmlu:prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.01924252622654455
},
"original|mmlu:professional_accounting|5": {
"acc": 0.6524822695035462,
"acc_stderr": 0.028406627809590954
},
"original|mmlu:professional_law|5": {
"acc": 0.6844850065189049,
"acc_stderr": 0.01186918484305864
},
"original|mmlu:professional_medicine|5": {
"acc": 0.875,
"acc_stderr": 0.020089743302935947
},
"original|mmlu:professional_psychology|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.01519415311318472
},
"original|mmlu:public_relations|5": {
"acc": 0.8818181818181818,
"acc_stderr": 0.030920863185231417
},
"original|mmlu:security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157
},
"original|mmlu:sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.02019067053502793
},
"original|mmlu:us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759033
},
"original|mmlu:virology|5": {
"acc": 0.7951807228915663,
"acc_stderr": 0.03141784291663925
},
"original|mmlu:world_religions|5": {
"acc": 0.9415204678362573,
"acc_stderr": 0.017996678857280124
}
}
```
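To work with these per-task scores programmatically instead of reading the JSON by eye, a small sketch like the following can be used. The dictionary is abbreviated here to a few entries copied from the output above; in practice you would paste or load the full dictionary.
```python
# A few entries copied from the printed results above, for illustration only
results = {
    "all": {"acc": 0.7857118464693361, "acc_stderr": 0.028708315182495378},
    "original|mmlu:abstract_algebra|5": {"acc": 0.47, "acc_stderr": 0.05016135580465919},
    "original|mmlu:anatomy|5": {"acc": 0.7481481481481481, "acc_stderr": 0.03749850709174022},
}

# Per-subject accuracies, excluding the pre-computed "all" aggregate
subject_scores = {task: metrics["acc"] for task, metrics in results.items() if task != "all"}

macro_avg = sum(subject_scores.values()) / len(subject_scores)
print(f"{len(subject_scores)} subjects, macro-average acc = {macro_avg:.4f}")

# Subjects sorted from weakest to strongest for this run
for task, acc in sorted(subject_scores.items(), key=lambda kv: kv[1]):
    print(task, round(acc, 4))
```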
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Voicelab__trurl-2-7b | 2023-08-27T12:41:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Voicelab/trurl-2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Voicelab/trurl-2-7b](https://huggingface.co/Voicelab/trurl-2-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Voicelab__trurl-2-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T14:14:32.422343](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-7b/blob/main/results_2023-08-17T14%3A14%3A32.422343.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5011844879329979,\n\
\ \"acc_stderr\": 0.03509934071413946,\n \"acc_norm\": 0.504850490041548,\n\
\ \"acc_norm_stderr\": 0.03508793737194848,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4541658189441706,\n\
\ \"mc2_stderr\": 0.01502523282080466\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.014611199329843777,\n\
\ \"acc_norm\": 0.5341296928327645,\n \"acc_norm_stderr\": 0.0145773113152311\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.568213503286198,\n\
\ \"acc_stderr\": 0.004943127583290922,\n \"acc_norm\": 0.7529376618203545,\n\
\ \"acc_norm_stderr\": 0.004304218408635195\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347368,\n \"\
acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347368\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n \"\
acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.03488901616852732,\n \"acc_norm\"\
: 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852732\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.032473902765696686,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.032473902765696686\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6862385321100918,\n \"acc_stderr\": 0.019894723341469116,\n \"\
acc_norm\": 0.6862385321100918,\n \"acc_norm_stderr\": 0.019894723341469116\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\
acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784701,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784701\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.033310925110381785,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.033310925110381785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.029614323690456655,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.029614323690456655\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.016740929047162706,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.016740929047162706\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.02686462436675666,\n\
\ \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.02686462436675666\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103986,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103986\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759563,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759563\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347663,\n\
\ \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347663\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.02840662780959095,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.02840662780959095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3644067796610169,\n\
\ \"acc_stderr\": 0.01229169498305648,\n \"acc_norm\": 0.3644067796610169,\n\
\ \"acc_norm_stderr\": 0.01229169498305648\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.03025437257397669,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03025437257397669\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4722222222222222,\n \"acc_stderr\": 0.020196594933541194,\n \
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.020196594933541194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894843,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894843\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4541658189441706,\n\
\ \"mc2_stderr\": 0.01502523282080466\n }\n}\n```"
repo_url: https://huggingface.co/Voicelab/trurl-2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T14:14:32.422343.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:14:32.422343.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T14:14:32.422343.parquet'
- config_name: results
data_files:
- split: 2023_08_17T14_14_32.422343
path:
- results_2023-08-17T14:14:32.422343.parquet
- split: latest
path:
- results_2023-08-17T14:14:32.422343.parquet
---
# Dataset Card for Evaluation run of Voicelab/trurl-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Voicelab/trurl-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Voicelab/trurl-2-7b](https://huggingface.co/Voicelab/trurl-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
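The same pattern works for the other configurations listed in this card. As a minimal sketch (assuming the `datasets` library is installed), the snippet below loads the aggregated "results" configuration at its "latest" split:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Voicelab__trurl-2-7b",
	"results",
	split="latest")
```
To pin a specific run instead, replace "latest" with one of the timestamped split names (for example 2023_08_17T14_14_32.422343).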
## Latest results
These are the [latest results from run 2023-08-17T14:14:32.422343](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-7b/blob/main/results_2023-08-17T14%3A14%3A32.422343.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5011844879329979,
"acc_stderr": 0.03509934071413946,
"acc_norm": 0.504850490041548,
"acc_norm_stderr": 0.03508793737194848,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4541658189441706,
"mc2_stderr": 0.01502523282080466
},
"harness|arc:challenge|25": {
"acc": 0.5025597269624573,
"acc_stderr": 0.014611199329843777,
"acc_norm": 0.5341296928327645,
"acc_norm_stderr": 0.0145773113152311
},
"harness|hellaswag|10": {
"acc": 0.568213503286198,
"acc_stderr": 0.004943127583290922,
"acc_norm": 0.7529376618203545,
"acc_norm_stderr": 0.004304218408635195
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347368,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347368
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852732,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852732
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.032473902765696686,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.032473902765696686
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6862385321100918,
"acc_stderr": 0.019894723341469116,
"acc_norm": 0.6862385321100918,
"acc_norm_stderr": 0.019894723341469116
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.03016513786784701,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.03016513786784701
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.033310925110381785,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.033310925110381785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.029614323690456655,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.029614323690456655
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.016740929047162706,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.016740929047162706
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.02686462436675666,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.02686462436675666
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103986,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103986
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759563,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347663,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347663
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3644067796610169,
"acc_stderr": 0.01229169498305648,
"acc_norm": 0.3644067796610169,
"acc_norm_stderr": 0.01229169498305648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03025437257397669,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03025437257397669
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.020196594933541194,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.020196594933541194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.031001209039894843,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.031001209039894843
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4541658189441706,
"mc2_stderr": 0.01502523282080466
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b | 2023-08-27T12:41:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-research-oasst1-llama-65b](https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T22:10:29.981773](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b/blob/main/results_2023-08-17T22%3A10%3A29.981773.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6359037673839993,\n\
\ \"acc_stderr\": 0.0329346816196445,\n \"acc_norm\": 0.6396809356138717,\n\
\ \"acc_norm_stderr\": 0.03290965482744071,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.48845185520886875,\n\
\ \"mc2_stderr\": 0.014057830912491135\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979275,\n\
\ \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.01396014260059868\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6664011153156741,\n\
\ \"acc_stderr\": 0.004705347137699622,\n \"acc_norm\": 0.8593905596494722,\n\
\ \"acc_norm_stderr\": 0.0034690778470563765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101737,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830513,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n\
\ \"acc_stderr\": 0.016197807956848043,\n \"acc_norm\": 0.8275229357798165,\n\
\ \"acc_norm_stderr\": 0.016197807956848043\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931055,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931055\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n\
\ \"acc_stderr\": 0.016704945740326188,\n \"acc_norm\": 0.4770949720670391,\n\
\ \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906497,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906497\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.0247238615047717,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.0247238615047717\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904212,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904212\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4830508474576271,\n\
\ \"acc_stderr\": 0.01276289688921086,\n \"acc_norm\": 0.4830508474576271,\n\
\ \"acc_norm_stderr\": 0.01276289688921086\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242304,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242304\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.48845185520886875,\n\
\ \"mc2_stderr\": 0.014057830912491135\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|arc:challenge|25_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hellaswag|10_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:10:29.981773.parquet'
- config_name: results
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- results_2023-08-17T17:53:50.635044.parquet
- split: 2023_08_17T22_10_29.981773
path:
- results_2023-08-17T22:10:29.981773.parquet
- split: latest
path:
- results_2023-08-17T22:10:29.981773.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-research-oasst1-llama-65b](https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b",
"harness_truthfulqa_mc_0",
split="train")
```
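The configs and splits listed in the YAML header above can also be targeted directly. Below is a minimal sketch, assuming the config and split names resolve exactly as written in the `data_files` listing (the `results` config holds the aggregated metrics, and each timestamped split corresponds to one run):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b"

# Aggregated metrics; the "latest" split mirrors the most recent run.
results = load_dataset(repo, "results", split="latest")

# Per-example details for one task and one specific run (timestamped split).
details = load_dataset(
    repo,
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_08_17T22_10_29.981773",
)

print(results.column_names)  # columns of the aggregated results table
print(len(details))          # number of evaluated examples for that task/run
```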
## Latest results
These are the [latest results from run 2023-08-17T22:10:29.981773](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b/blob/main/results_2023-08-17T22%3A10%3A29.981773.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6359037673839993,
"acc_stderr": 0.0329346816196445,
"acc_norm": 0.6396809356138717,
"acc_norm_stderr": 0.03290965482744071,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.48845185520886875,
"mc2_stderr": 0.014057830912491135
},
"harness|arc:challenge|25": {
"acc": 0.6177474402730375,
"acc_stderr": 0.014200454049979275,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.01396014260059868
},
"harness|hellaswag|10": {
"acc": 0.6664011153156741,
"acc_stderr": 0.004705347137699622,
"acc_norm": 0.8593905596494722,
"acc_norm_stderr": 0.0034690778470563765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830513,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931055,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931055
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876166,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4770949720670391,
"acc_stderr": 0.016704945740326188,
"acc_norm": 0.4770949720670391,
"acc_norm_stderr": 0.016704945740326188
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906497,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.0247238615047717,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.0247238615047717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4830508474576271,
"acc_stderr": 0.01276289688921086,
"acc_norm": 0.4830508474576271,
"acc_norm_stderr": 0.01276289688921086
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242304,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242304
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.48845185520886875,
"mc2_stderr": 0.014057830912491135
}
}
```
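If you only need an aggregate such as the MMLU average, it can be recomputed from the per-task entries above. Below is a minimal sketch, assuming `results` holds the dict printed above (for instance the `"results"` field of the linked JSON file):
```python
def mean_metric(results: dict,
                prefix: str = "harness|hendrycksTest",
                metric: str = "acc_norm") -> float:
    """Average `metric` over every task whose key starts with `prefix`."""
    values = [
        scores[metric]
        for task, scores in results.items()
        if task.startswith(prefix)
    ]
    return sum(values) / len(values)

# Mean normalized accuracy over the 57 MMLU (hendrycksTest) subtasks;
# ARC, HellaSwag, and TruthfulQA use different key prefixes and are excluded.
# print(round(mean_metric(results), 4))
```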
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16 | 2023-08-27T12:41:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CoolWP/llama-2-13b-guanaco-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CoolWP/llama-2-13b-guanaco-fp16](https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T18:49:30.894423](https://huggingface.co/datasets/open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16/blob/main/results_2023-08-17T18%3A49%3A30.894423.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5557402565625233,\n\
\ \"acc_stderr\": 0.03433097920024075,\n \"acc_norm\": 0.5600027152011281,\n\
\ \"acc_norm_stderr\": 0.03430992590405376,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43400538092704843,\n\
\ \"mc2_stderr\": 0.014284105671223521\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.615116510655248,\n\
\ \"acc_stderr\": 0.004855733568540267,\n \"acc_norm\": 0.8239394542919737,\n\
\ \"acc_norm_stderr\": 0.003800932770597752\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871137,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871137\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890474,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890474\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7586206896551724,\n\
\ \"acc_stderr\": 0.015302380123542108,\n \"acc_norm\": 0.7586206896551724,\n\
\ \"acc_norm_stderr\": 0.015302380123542108\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765408,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765408\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\
\ \"acc_stderr\": 0.01600698993480319,\n \"acc_norm\": 0.3553072625698324,\n\
\ \"acc_norm_stderr\": 0.01600698993480319\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347813,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722334,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.012599505608336461,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.012599505608336461\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \
\ \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43400538092704843,\n\
\ \"mc2_stderr\": 0.014284105671223521\n }\n}\n```"
repo_url: https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:49:30.894423.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:49:30.894423.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:49:30.894423.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_49_30.894423
path:
- results_2023-08-17T18:49:30.894423.parquet
- split: latest
path:
- results_2023-08-17T18:49:30.894423.parquet
---
# Dataset Card for Evaluation run of CoolWP/llama-2-13b-guanaco-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CoolWP/llama-2-13b-guanaco-fp16](https://huggingface.co/CoolWP/llama-2-13b-guanaco-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
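If you want to browse the available configurations, or pull the aggregated numbers instead of the per-sample details, here is a minimal sketch along the same lines (the configuration and split names are taken from the `configs` section of this card; the exact column layout of the parquet files is not guaranteed here):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16"

# List every configuration: one per evaluated task, plus the aggregated "results".
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "latest" split of the "results" configuration holds the aggregated metrics
# displayed on the leaderboard for the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```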
## Latest results
These are the [latest results from run 2023-08-17T18:49:30.894423](https://huggingface.co/datasets/open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16/blob/main/results_2023-08-17T18%3A49%3A30.894423.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5557402565625233,
"acc_stderr": 0.03433097920024075,
"acc_norm": 0.5600027152011281,
"acc_norm_stderr": 0.03430992590405376,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.43400538092704843,
"mc2_stderr": 0.014284105671223521
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526843,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.615116510655248,
"acc_stderr": 0.004855733568540267,
"acc_norm": 0.8239394542919737,
"acc_norm_stderr": 0.003800932770597752
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871137,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871137
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890474,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890474
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7586206896551724,
"acc_stderr": 0.015302380123542108,
"acc_norm": 0.7586206896551724,
"acc_norm_stderr": 0.015302380123542108
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765408,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3553072625698324,
"acc_stderr": 0.01600698993480319,
"acc_norm": 0.3553072625698324,
"acc_norm_stderr": 0.01600698993480319
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347813,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722334,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336461,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336461
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5408496732026143,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.5408496732026143,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.43400538092704843,
"mc2_stderr": 0.014284105671223521
}
}
```
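Because successive runs may cover different tasks, the per-task configurations with their "latest" split are the most reliable way to retrieve the details of a given eval. A small sketch, assuming the config naming pattern shown in the `configs` section above (the printed column names depend on the harness version and are not guaranteed here):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_CoolWP__llama-2-13b-guanaco-fp16"

# Per-sample details for a single MMLU subtask; "latest" always points to the
# most recent evaluation run that covered this particular task.
details = load_dataset(repo, "harness_hendrycksTest_world_religions_5", split="latest")

print(details.column_names)  # actual columns depend on the harness version
print(details[0])            # first evaluated example of the subtask
```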
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1 | 2023-09-17T02:48:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T02:48:34.876063](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1/blob/main/results_2023-09-17T02-48-34.876063.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018141778523489933,\n\
\ \"em_stderr\": 0.0013667968592600823,\n \"f1\": 0.0824182046979865,\n\
\ \"f1_stderr\": 0.0019512337351707363,\n \"acc\": 0.30108941444123377,\n\
\ \"acc_stderr\": 0.0072592536452981875\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.018141778523489933,\n \"em_stderr\": 0.0013667968592600823,\n\
\ \"f1\": 0.0824182046979865,\n \"f1_stderr\": 0.0019512337351707363\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.000758150113722541\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.013760357176873834\n\
\ }\n}\n```"
repo_url: https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T02_48_34.876063
path:
- '**/details_harness|drop|3_2023-09-17T02-48-34.876063.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T02-48-34.876063.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T02_48_34.876063
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-48-34.876063.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-48-34.876063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:06:24.257655.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:06:24.257655.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T02_48_34.876063
path:
- '**/details_harness|winogrande|5_2023-09-17T02-48-34.876063.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T02-48-34.876063.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_06_24.257655
path:
- results_2023-08-17T19:06:24.257655.parquet
- split: 2023_09_17T02_48_34.876063
path:
- results_2023-09-17T02-48-34.876063.parquet
- split: latest
path:
- results_2023-09-17T02-48-34.876063.parquet
---
# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1",
"harness_winogrande_5",
split="train")
```
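For example, to load only the most recent run of a single task, or the aggregated scores stored in the "results" configuration, you could do something along these lines (a minimal sketch; the config and split names are the ones listed in this card):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1"

# Latest run of the WinoGrande task only
latest_winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")

# Aggregated metrics, via the "results" configuration
aggregated = load_dataset(repo, "results", split="latest")
```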
## Latest results
These are the [latest results from run 2023-09-17T02:48:34.876063](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1/blob/main/results_2023-09-17T02-48-34.876063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.018141778523489933,
"em_stderr": 0.0013667968592600823,
"f1": 0.0824182046979865,
"f1_stderr": 0.0019512337351707363,
"acc": 0.30108941444123377,
"acc_stderr": 0.0072592536452981875
},
"harness|drop|3": {
"em": 0.018141778523489933,
"em_stderr": 0.0013667968592600823,
"f1": 0.0824182046979865,
"f1_stderr": 0.0019512337351707363
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.000758150113722541
},
"harness|winogrande|5": {
"acc": 0.601420678768745,
"acc_stderr": 0.013760357176873834
}
}
```
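If you prefer the raw JSON shown above rather than a `datasets` split, one possible way to fetch it is with `huggingface_hub` (a sketch, using the file name from the link above):
```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced above from the dataset repo
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-FT-LoRA-8bit-test1",
    filename="results_2023-09-17T02-48-34.876063.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # aggregated accuracy, as shown in the snippet above
```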
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft | 2023-08-27T12:41:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of davzoku/cria-llama2-7b-v1.3_peft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [davzoku/cria-llama2-7b-v1.3_peft](https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T19:00:21.145546](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft/blob/main/results_2023-08-17T19%3A00%3A21.145546.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46718327281573946,\n\
\ \"acc_stderr\": 0.03524459803640101,\n \"acc_norm\": 0.4707750480066663,\n\
\ \"acc_norm_stderr\": 0.03523208010539278,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4551798457212092,\n\
\ \"mc2_stderr\": 0.015805686232285752\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48890784982935154,\n \"acc_stderr\": 0.014607794914013057,\n\
\ \"acc_norm\": 0.514505119453925,\n \"acc_norm_stderr\": 0.014605241081370056\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5871340370444135,\n\
\ \"acc_stderr\": 0.004913429010559064,\n \"acc_norm\": 0.773451503684525,\n\
\ \"acc_norm_stderr\": 0.004177424913716106\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621502,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621502\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5283018867924528,\n \"acc_stderr\": 0.030723535249006107,\n\
\ \"acc_norm\": 0.5283018867924528,\n \"acc_norm_stderr\": 0.030723535249006107\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.3815028901734104,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5161290322580645,\n\
\ \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.5161290322580645,\n\
\ \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331813,\n\
\ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331813\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6403669724770642,\n \"acc_stderr\": 0.02057523466012378,\n \"\
acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.02057523466012378\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329882,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329882\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6078431372549019,\n \"acc_stderr\": 0.03426712349247273,\n \"\
acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.03426712349247273\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5949367088607594,\n \"acc_stderr\": 0.03195514741370671,\n \
\ \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.03195514741370671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255097,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255097\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.04802694698258973,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.04802694698258973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.030351527323344937,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.030351527323344937\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n\
\ \"acc_stderr\": 0.01696703176641362,\n \"acc_norm\": 0.6577266922094508,\n\
\ \"acc_norm_stderr\": 0.01696703176641362\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258875,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331154,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331154\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.028217683556652315,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.028217683556652315\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5246913580246914,\n \"acc_stderr\": 0.027786800931427443,\n\
\ \"acc_norm\": 0.5246913580246914,\n \"acc_norm_stderr\": 0.027786800931427443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3363754889178618,\n\
\ \"acc_stderr\": 0.012067083079452224,\n \"acc_norm\": 0.3363754889178618,\n\
\ \"acc_norm_stderr\": 0.012067083079452224\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44607843137254904,\n \"acc_stderr\": 0.020109864547181354,\n \
\ \"acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.020109864547181354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.03512310964123935,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.03512310964123935\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4551798457212092,\n\
\ \"mc2_stderr\": 0.015805686232285752\n }\n}\n```"
repo_url: https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:00:21.145546.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:00:21.145546.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:00:21.145546.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_00_21.145546
path:
- results_2023-08-17T19:00:21.145546.parquet
- split: latest
path:
- results_2023-08-17T19:00:21.145546.parquet
---
# Dataset Card for Evaluation run of davzoku/cria-llama2-7b-v1.3_peft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [davzoku/cria-llama2-7b-v1.3_peft](https://huggingface.co/davzoku/cria-llama2-7b-v1.3_peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft",
"harness_truthfulqa_mc_0",
split="train")
```
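You can also target a single task or the aggregated metrics directly. The snippet below is a minimal sketch; the configuration and split names are the ones declared in the YAML header of this card:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft"

# Per-sample details for one MMLU subtask, always pointing at the most recent run
mmlu_anatomy = load_dataset(repo, "harness_hendrycksTest_anatomy_5", split="latest")

# Aggregated metrics stored in the "results" configuration
results = load_dataset(repo, "results", split="latest")
```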
## Latest results
These are the [latest results from run 2023-08-17T19:00:21.145546](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft/blob/main/results_2023-08-17T19%3A00%3A21.145546.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46718327281573946,
"acc_stderr": 0.03524459803640101,
"acc_norm": 0.4707750480066663,
"acc_norm_stderr": 0.03523208010539278,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4551798457212092,
"mc2_stderr": 0.015805686232285752
},
"harness|arc:challenge|25": {
"acc": 0.48890784982935154,
"acc_stderr": 0.014607794914013057,
"acc_norm": 0.514505119453925,
"acc_norm_stderr": 0.014605241081370056
},
"harness|hellaswag|10": {
"acc": 0.5871340370444135,
"acc_stderr": 0.004913429010559064,
"acc_norm": 0.773451503684525,
"acc_norm_stderr": 0.004177424913716106
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5283018867924528,
"acc_stderr": 0.030723535249006107,
"acc_norm": 0.5283018867924528,
"acc_norm_stderr": 0.030723535249006107
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.024962683564331813,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.024962683564331813
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.02057523466012378,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.02057523466012378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329882,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329882
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.03426712349247273,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.03426712349247273
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5949367088607594,
"acc_stderr": 0.03195514741370671,
"acc_norm": 0.5949367088607594,
"acc_norm_stderr": 0.03195514741370671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255097,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255097
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.04802694698258973,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.04802694698258973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344937,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344937
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.01696703176641362,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.01696703176641362
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.026854257928258875,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.026854257928258875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331154,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.028217683556652315,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.028217683556652315
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5246913580246914,
"acc_stderr": 0.027786800931427443,
"acc_norm": 0.5246913580246914,
"acc_norm_stderr": 0.027786800931427443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3363754889178618,
"acc_stderr": 0.012067083079452224,
"acc_norm": 0.3363754889178618,
"acc_norm_stderr": 0.012067083079452224
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.020109864547181354,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.020109864547181354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123935,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123935
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4551798457212092,
"mc2_stderr": 0.015805686232285752
}
}
```
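The per-sample predictions behind these aggregates live in the task-specific configurations. As a minimal sketch (using only configuration names declared above), you can list every configuration of this repository and inspect one of them:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_davzoku__cria-llama2-7b-v1.3_peft"

# List the 61 configurations declared in this card
for config in get_dataset_config_names(repo):
    print(config)

# Look at the first per-sample record of the TruthfulQA details
details = load_dataset(repo, "harness_truthfulqa_mc_0", split="latest")
print(details[0])
```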
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular | 2023-08-27T12:41:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/llama2-22b-blocktriangular
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/llama2-22b-blocktriangular](https://huggingface.co/chargoddard/llama2-22b-blocktriangular)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T16:15:19.075132](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular/blob/main/results_2023-08-17T16%3A15%3A19.075132.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5466072939361837,\n\
\ \"acc_stderr\": 0.03438387341082999,\n \"acc_norm\": 0.5507309274146267,\n\
\ \"acc_norm_stderr\": 0.034363434185586184,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875833,\n \"mc2\": 0.3922536374424786,\n\
\ \"mc2_stderr\": 0.013854973813723448\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5494880546075085,\n \"acc_stderr\": 0.014539646098471627,\n\
\ \"acc_norm\": 0.5827645051194539,\n \"acc_norm_stderr\": 0.014409825518403082\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6169089822744473,\n\
\ \"acc_stderr\": 0.00485146662360145,\n \"acc_norm\": 0.8269269069906393,\n\
\ \"acc_norm_stderr\": 0.00377537291428549\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.02698528957655274,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.02698528957655274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244442,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244442\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534738,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534738\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696041,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696041\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n\
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7394636015325671,\n\
\ \"acc_stderr\": 0.015696008563807068,\n \"acc_norm\": 0.7394636015325671,\n\
\ \"acc_norm_stderr\": 0.015696008563807068\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n\
\ \"acc_stderr\": 0.015183844307206151,\n \"acc_norm\": 0.2905027932960894,\n\
\ \"acc_norm_stderr\": 0.015183844307206151\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971642,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971642\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719967,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719967\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543458,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n\
\ \"acc_stderr\": 0.012602244505788236,\n \"acc_norm\": 0.41916558018252936,\n\
\ \"acc_norm_stderr\": 0.012602244505788236\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213535,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213535\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492534,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492534\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875833,\n \"mc2\": 0.3922536374424786,\n\
\ \"mc2_stderr\": 0.013854973813723448\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/llama2-22b-blocktriangular
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|arc:challenge|25_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hellaswag|10_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T16:15:19.075132.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T16:15:19.075132.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T16:15:19.075132.parquet'
- config_name: results
data_files:
- split: 2023_08_17T16_15_19.075132
path:
- results_2023-08-17T16:15:19.075132.parquet
- split: latest
path:
- results_2023-08-17T16:15:19.075132.parquet
---
# Dataset Card for Evaluation run of chargoddard/llama2-22b-blocktriangular
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/llama2-22b-blocktriangular
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/llama2-22b-blocktriangular](https://huggingface.co/chargoddard/llama2-22b-blocktriangular) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular",
"harness_truthfulqa_mc_0",
split="train")
```
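The aggregated metrics for the run live in the "results" configuration, and the "latest" split always resolves to the most recent evaluation. A minimal sketch using the same `datasets` API as above:
```python
from datasets import load_dataset

# Aggregated metrics for this run; "latest" always points to the newest evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular",
    "results",
    split="latest",
)
print(results[0])  # first (and typically only) row, holding the aggregated scores
```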
## Latest results
These are the [latest results from run 2023-08-17T16:15:19.075132](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular/blob/main/results_2023-08-17T16%3A15%3A19.075132.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5466072939361837,
"acc_stderr": 0.03438387341082999,
"acc_norm": 0.5507309274146267,
"acc_norm_stderr": 0.034363434185586184,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875833,
"mc2": 0.3922536374424786,
"mc2_stderr": 0.013854973813723448
},
"harness|arc:challenge|25": {
"acc": 0.5494880546075085,
"acc_stderr": 0.014539646098471627,
"acc_norm": 0.5827645051194539,
"acc_norm_stderr": 0.014409825518403082
},
"harness|hellaswag|10": {
"acc": 0.6169089822744473,
"acc_stderr": 0.00485146662360145,
"acc_norm": 0.8269269069906393,
"acc_norm_stderr": 0.00377537291428549
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534738,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534738
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696041,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696041
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7394636015325671,
"acc_stderr": 0.015696008563807068,
"acc_norm": 0.7394636015325671,
"acc_norm_stderr": 0.015696008563807068
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.026074314851657083,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.026074314851657083
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.015183844307206151,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.015183844307206151
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971642,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971642
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719967,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543458,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788236,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213535,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213535
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492534,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492534
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875833,
"mc2": 0.3922536374424786,
"mc2_stderr": 0.013854973813723448
}
}
```
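If you would rather read the raw results JSON linked above instead of the parquet splits, a minimal sketch with `huggingface_hub` (the filename is taken from the link to this run; other runs will have a different timestamp):
```python
import json

from huggingface_hub import hf_hub_download

# Download the per-run results file from the dataset repo and load it as a dict.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_chargoddard__llama2-22b-blocktriangular",
    filename="results_2023-08-17T16:15:19.075132.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(list(results))  # top-level keys of the raw results file
```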
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k | 2023-08-27T12:41:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yihan6324/llama2-7b-instructmining-orca-40k](https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T00:53:27.654117](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k/blob/main/results_2023-08-18T00%3A53%3A27.654117.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4847120233306423,\n\
\ \"acc_stderr\": 0.03527399847085323,\n \"acc_norm\": 0.4884455010512822,\n\
\ \"acc_norm_stderr\": 0.035257414280301984,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5103220670450638,\n\
\ \"mc2_stderr\": 0.015890639542177364\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5674061433447098,\n \"acc_norm_stderr\": 0.014478005694182524\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6196972714598685,\n\
\ \"acc_stderr\": 0.004844690404713595,\n \"acc_norm\": 0.8024297948615814,\n\
\ \"acc_norm_stderr\": 0.0039735233080143454\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484865,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484865\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5645161290322581,\n \"acc_stderr\": 0.028206225591502737,\n \"\
acc_norm\": 0.5645161290322581,\n \"acc_norm_stderr\": 0.028206225591502737\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036543,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036543\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.03804913653971011,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.03804913653971011\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6623853211009174,\n \"acc_stderr\": 0.02027526598663891,\n \"\
acc_norm\": 0.6623853211009174,\n \"acc_norm_stderr\": 0.02027526598663891\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319355,\n \"\
acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319355\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n\
\ \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n\
\ \"acc_stderr\": 0.016808322261740467,\n \"acc_norm\": 0.6704980842911877,\n\
\ \"acc_norm_stderr\": 0.016808322261740467\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.47109826589595377,\n \"acc_stderr\": 0.02687408588351835,\n\
\ \"acc_norm\": 0.47109826589595377,\n \"acc_norm_stderr\": 0.02687408588351835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.014487500852850407,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.014487500852850407\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.02776768960683393,\n\
\ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.02776768960683393\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650154,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854924,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4395424836601307,\n \"acc_stderr\": 0.02007942040808792,\n \
\ \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.02007942040808792\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.03125127591089165,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.03125127591089165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979034,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979034\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5103220670450638,\n\
\ \"mc2_stderr\": 0.015890639542177364\n }\n}\n```"
repo_url: https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:53:27.654117.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:53:27.654117.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:53:27.654117.parquet'
- config_name: results
data_files:
- split: 2023_08_18T00_53_27.654117
path:
- results_2023-08-18T00:53:27.654117.parquet
- split: latest
path:
- results_2023-08-18T00:53:27.654117.parquet
---
# Dataset Card for Evaluation run of yihan6324/llama2-7b-instructmining-orca-40k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yihan6324/llama2-7b-instructmining-orca-40k](https://huggingface.co/yihan6324/llama2-7b-instructmining-orca-40k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k",
"harness_truthfulqa_mc_0",
	split="latest")
```
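The per-task configurations follow the same pattern. As a minimal sketch (assuming the config names and the "latest" split declared in the YAML header of this card), you could instead load the most recent run of a single MMLU subtask:
```python
from datasets import load_dataset

# Minimal sketch: load the most recent run of one MMLU subtask.
# The config name and the "latest" split are taken from this repo's YAML header.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(abstract_algebra)
```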
## Latest results
These are the [latest results from run 2023-08-18T00:53:27.654117](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k/blob/main/results_2023-08-18T00%3A53%3A27.654117.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4847120233306423,
"acc_stderr": 0.03527399847085323,
"acc_norm": 0.4884455010512822,
"acc_norm_stderr": 0.035257414280301984,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5103220670450638,
"mc2_stderr": 0.015890639542177364
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5674061433447098,
"acc_norm_stderr": 0.014478005694182524
},
"harness|hellaswag|10": {
"acc": 0.6196972714598685,
"acc_stderr": 0.004844690404713595,
"acc_norm": 0.8024297948615814,
"acc_norm_stderr": 0.0039735233080143454
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484865,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484865
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.028206225591502737,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.028206225591502737
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036543,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036543
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.03804913653971011,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.03804913653971011
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6623853211009174,
"acc_stderr": 0.02027526598663891,
"acc_norm": 0.6623853211009174,
"acc_norm_stderr": 0.02027526598663891
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.03087453753755362,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.03087453753755362
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.016808322261740467,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.016808322261740467
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.47109826589595377,
"acc_stderr": 0.02687408588351835,
"acc_norm": 0.47109826589595377,
"acc_norm_stderr": 0.02687408588351835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850407,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.02776768960683393,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.02776768960683393
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650154,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854924,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4395424836601307,
"acc_stderr": 0.02007942040808792,
"acc_norm": 0.4395424836601307,
"acc_norm_stderr": 0.02007942040808792
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.03125127591089165,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.03125127591089165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979034,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979034
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479637,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479637
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5103220670450638,
"mc2_stderr": 0.015890639542177364
}
}
```
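The aggregated numbers above are also exposed through the "results" configuration declared in the YAML header. A minimal sketch, assuming the same repo and the "latest" split:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics; "latest" points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-7b-instructmining-orca-40k",
    "results",
    split="latest",
)
print(results)
```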
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt | 2023-08-27T12:41:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yihan6324/llama2-13b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T15:06:33.773565](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt/blob/main/results_2023-08-17T15%3A06%3A33.773565.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5659221179678169,\n\
\ \"acc_stderr\": 0.03435610194042996,\n \"acc_norm\": 0.5698526105353496,\n\
\ \"acc_norm_stderr\": 0.03433517528186645,\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5244082340441981,\n\
\ \"mc2_stderr\": 0.015623466277080963\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6328420633339972,\n\
\ \"acc_stderr\": 0.004810449343572395,\n \"acc_norm\": 0.8306114319856602,\n\
\ \"acc_norm_stderr\": 0.003743281749373634\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982026\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"\
acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786753,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786753\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011748,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011748\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7669724770642202,\n \"acc_stderr\": 0.018125669180861507,\n \"\
acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.018125669180861507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145638,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145638\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.02390232554956041,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.02390232554956041\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395958,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395958\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306386,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306386\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n\
\ \"acc_stderr\": 0.01646320023811453,\n \"acc_norm\": 0.4122905027932961,\n\
\ \"acc_norm_stderr\": 0.01646320023811453\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891776,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891776\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347817,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347817\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811943,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811943\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271486,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271486\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.01997742260022747,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.01997742260022747\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5244082340441981,\n\
\ \"mc2_stderr\": 0.015623466277080963\n }\n}\n```"
repo_url: https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:06:33.773565.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:06:33.773565.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:06:33.773565.parquet'
- config_name: results
data_files:
- split: 2023_08_17T15_06_33.773565
path:
- results_2023-08-17T15:06:33.773565.parquet
- split: latest
path:
- results_2023-08-17T15:06:33.773565.parquet
---
# Dataset Card for Evaluation run of yihan6324/llama2-13b-instructmining-40k-sharegpt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yihan6324/llama2-13b-instructmining-40k-sharegpt](https://huggingface.co/yihan6324/llama2-13b-instructmining-40k-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt",
"harness_truthfulqa_mc_0",
split="train")
```
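The per-task configurations listed in the YAML header also expose a `latest` split alongside the timestamped one; a minimal sketch of loading it for a single MMLU subtask (the config name is taken from the list above, and the printed schema should be inspected before relying on specific column names):
```python
from datasets import load_dataset

# Load the most recent evaluation details for one MMLU subtask.
# The config name and the "latest" split come from the YAML configs above.
details = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)  # inspect the available columns before accessing specific fields
```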
## Latest results
These are the [latest results from run 2023-08-17T15:06:33.773565](https://huggingface.co/datasets/open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt/blob/main/results_2023-08-17T15%3A06%3A33.773565.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5659221179678169,
"acc_stderr": 0.03435610194042996,
"acc_norm": 0.5698526105353496,
"acc_norm_stderr": 0.03433517528186645,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5244082340441981,
"mc2_stderr": 0.015623466277080963
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809169
},
"harness|hellaswag|10": {
"acc": 0.6328420633339972,
"acc_stderr": 0.004810449343572395,
"acc_norm": 0.8306114319856602,
"acc_norm_stderr": 0.003743281749373634
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982026,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982026
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786753,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786753
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011748,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011748
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.018125669180861507,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.018125669180861507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956041,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956041
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395958,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395958
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306386,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306386
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.01646320023811453,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.01646320023811453
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891776,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891776
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347817,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347817
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811943,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03004261583271486,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03004261583271486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.01997742260022747,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.01997742260022747
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5244082340441981,
"mc2_stderr": 0.015623466277080963
}
}
```
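The same aggregated numbers can be retrieved programmatically from the "results" configuration declared in the YAML header; a minimal sketch (the schema of the underlying parquet is not documented here, so the printout should be checked before using particular columns):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above,
# with one split per run plus a "latest" alias.
results = load_dataset(
    "open-llm-leaderboard/details_yihan6324__llama2-13b-instructmining-40k-sharegpt",
    "results",
    split="latest",
)
print(results)  # column names are an assumption until inspected here
```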
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_heegyu__LIMA-13b-hf | 2023-08-27T12:41:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of heegyu/LIMA-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/LIMA-13b-hf](https://huggingface.co/heegyu/LIMA-13b-hf) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__LIMA-13b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T19:40:51.725558](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA-13b-hf/blob/main/results_2023-08-17T19%3A40%3A51.725558.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4903418992265545,\n\
\ \"acc_stderr\": 0.035105237487249016,\n \"acc_norm\": 0.4942155039802663,\n\
\ \"acc_norm_stderr\": 0.035086721546033074,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4175999088658601,\n\
\ \"mc2_stderr\": 0.013885459573047185\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.01455810654392407,\n\
\ \"acc_norm\": 0.5742320819112628,\n \"acc_norm_stderr\": 0.014449464278868812\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6197968532164907,\n\
\ \"acc_stderr\": 0.004844445265582656,\n \"acc_norm\": 0.8167695678151763,\n\
\ \"acc_norm_stderr\": 0.0038606469988972823\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808086,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808086\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\"\
: 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5353535353535354,\n \"acc_stderr\": 0.035534363688280626,\n \"\
acc_norm\": 0.5353535353535354,\n \"acc_norm_stderr\": 0.035534363688280626\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6735751295336787,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.6735751295336787,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126184,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126184\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4789915966386555,\n \"acc_stderr\": 0.032449808499900284,\n\
\ \"acc_norm\": 0.4789915966386555,\n \"acc_norm_stderr\": 0.032449808499900284\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6275229357798165,\n \"acc_stderr\": 0.020728368457638497,\n \"\
acc_norm\": 0.6275229357798165,\n \"acc_norm_stderr\": 0.020728368457638497\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5882352941176471,\n \"acc_stderr\": 0.03454236585380609,\n \"\
acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03454236585380609\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.029202540153431194,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.029202540153431194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.02687408588351835,\n\
\ \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.02687408588351835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364553,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364553\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.0285803410651383,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.0285803410651383\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n\
\ \"acc_stderr\": 0.028237769422085335,\n \"acc_norm\": 0.5530546623794212,\n\
\ \"acc_norm_stderr\": 0.028237769422085335\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5493827160493827,\n \"acc_stderr\": 0.027684721415656203,\n\
\ \"acc_norm\": 0.5493827160493827,\n \"acc_norm_stderr\": 0.027684721415656203\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0286638201471995,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0286638201471995\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854931,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854931\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150117,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.03445789964362749,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.03445789964362749\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4175999088658601,\n\
\ \"mc2_stderr\": 0.013885459573047185\n }\n}\n```"
repo_url: https://huggingface.co/heegyu/LIMA-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T19:40:51.725558.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:40:51.725558.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T19:40:51.725558.parquet'
- config_name: results
data_files:
- split: 2023_08_17T19_40_51.725558
path:
- results_2023-08-17T19:40:51.725558.parquet
- split: latest
path:
- results_2023-08-17T19:40:51.725558.parquet
---
# Dataset Card for Evaluation run of heegyu/LIMA-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/LIMA-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/LIMA-13b-hf](https://huggingface.co/heegyu/LIMA-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
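If you want the aggregated metrics rather than the per-task details, you can point the same call at the "results" configuration mentioned above. The snippet below is a minimal sketch using the configuration and split names listed in this card's YAML header; the exact column layout of the results parquet is not documented here, so inspect it before relying on a schema.
```python
from datasets import load_dataset

# Aggregated metrics live in the "results" configuration; the "latest" split
# always points to the most recent run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
    "results",
    split="latest",
)

# Inspect what the results parquet actually contains before assuming a schema.
print(results.column_names)
print(results[0])
```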
## Latest results
These are the [latest results from run 2023-08-17T19:40:51.725558](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA-13b-hf/blob/main/results_2023-08-17T19%3A40%3A51.725558.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4903418992265545,
"acc_stderr": 0.035105237487249016,
"acc_norm": 0.4942155039802663,
"acc_norm_stderr": 0.035086721546033074,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4175999088658601,
"mc2_stderr": 0.013885459573047185
},
"harness|arc:challenge|25": {
"acc": 0.5426621160409556,
"acc_stderr": 0.01455810654392407,
"acc_norm": 0.5742320819112628,
"acc_norm_stderr": 0.014449464278868812
},
"harness|hellaswag|10": {
"acc": 0.6197968532164907,
"acc_stderr": 0.004844445265582656,
"acc_norm": 0.8167695678151763,
"acc_norm_stderr": 0.0038606469988972823
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808086,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808086
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5353535353535354,
"acc_stderr": 0.035534363688280626,
"acc_norm": 0.5353535353535354,
"acc_norm_stderr": 0.035534363688280626
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6735751295336787,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.6735751295336787,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126184,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126184
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4789915966386555,
"acc_stderr": 0.032449808499900284,
"acc_norm": 0.4789915966386555,
"acc_norm_stderr": 0.032449808499900284
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6275229357798165,
"acc_stderr": 0.020728368457638497,
"acc_norm": 0.6275229357798165,
"acc_norm_stderr": 0.020728368457638497
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03454236585380609,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03454236585380609
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.029202540153431194,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.029202540153431194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.02687408588351835,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.02687408588351835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364553,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364553
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.0285803410651383,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.0285803410651383
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.028237769422085335,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.028237769422085335
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5493827160493827,
"acc_stderr": 0.027684721415656203,
"acc_norm": 0.5493827160493827,
"acc_norm_stderr": 0.027684721415656203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0286638201471995,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0286638201471995
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854931,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150117,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.03445789964362749,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.03445789964362749
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4175999088658601,
"mc2_stderr": 0.013885459573047185
}
}
```
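Each per-task number above also has its own configuration in this repository, so you can pull the underlying per-example details behind any single score. The sketch below loads the latest split of one MMLU subtask configuration named in the YAML header (any other `harness_hendrycksTest_*` config works the same way); the record fields depend on the harness version used for this run.
```python
from datasets import load_dataset

# Per-task details: one configuration per task, each with a "latest" split.
details = load_dataset(
    "open-llm-leaderboard/details_heegyu__LIMA-13b-hf",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

print(len(details))  # number of evaluated examples
print(details[0])    # one per-example record; field names depend on the harness version
```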
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged | 2023-09-23T18:26:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of minlik/chinese-alpaca-33b-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [minlik/chinese-alpaca-33b-merged](https://huggingface.co/minlik/chinese-alpaca-33b-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T18:26:45.770833](https://huggingface.co/datasets/open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged/blob/main/results_2023-09-23T18-26-45.770833.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.34375,\n \
\ \"em_stderr\": 0.004864023482291936,\n \"f1\": 0.39666317114094085,\n\
\ \"f1_stderr\": 0.00476283705283174,\n \"acc\": 0.4206081596579169,\n\
\ \"acc_stderr\": 0.00973840020904149\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.34375,\n \"em_stderr\": 0.004864023482291936,\n \
\ \"f1\": 0.39666317114094085,\n \"f1_stderr\": 0.00476283705283174\n \
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \
\ \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\": {\n\
\ \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843902\n\
\ }\n}\n```"
repo_url: https://huggingface.co/minlik/chinese-alpaca-33b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T18_26_45.770833
path:
- '**/details_harness|drop|3_2023-09-23T18-26-45.770833.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T18-26-45.770833.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T18_26_45.770833
path:
- '**/details_harness|gsm8k|5_2023-09-23T18-26-45.770833.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T18-26-45.770833.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:44:03.944390.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:44:03.944390.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T18_26_45.770833
path:
- '**/details_harness|winogrande|5_2023-09-23T18-26-45.770833.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T18-26-45.770833.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_44_03.944390
path:
- results_2023-08-18T14:44:03.944390.parquet
- split: 2023_09_23T18_26_45.770833
path:
- results_2023-09-23T18-26-45.770833.parquet
- split: latest
path:
- results_2023-09-23T18-26-45.770833.parquet
---
# Dataset Card for Evaluation run of minlik/chinese-alpaca-33b-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/minlik/chinese-alpaca-33b-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [minlik/chinese-alpaca-33b-merged](https://huggingface.co/minlik/chinese-alpaca-33b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged",
"harness_winogrande_5",
split="train")
```
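Beyond the per-run details, the `results` configuration listed in the metadata above aggregates the scores of every run, and each configuration also defines a `latest` split pointing at the most recent files. A minimal sketch of loading those aggregated results, using only the configuration and split names declared above:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split mirrors the
# newest timestamped split listed in the metadata (2023-09-23T18-26-45.770833).
results = load_dataset(
    "open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged",
    "results",
    split="latest",
)
print(results[0])  # first row of aggregated results
```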
## Latest results
These are the [latest results from run 2023-09-23T18:26:45.770833](https://huggingface.co/datasets/open-llm-leaderboard/details_minlik__chinese-alpaca-33b-merged/blob/main/results_2023-09-23T18-26-45.770833.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.34375,
"em_stderr": 0.004864023482291936,
"f1": 0.39666317114094085,
"f1_stderr": 0.00476283705283174,
"acc": 0.4206081596579169,
"acc_stderr": 0.00973840020904149
},
"harness|drop|3": {
"em": 0.34375,
"em_stderr": 0.004864023482291936,
"f1": 0.39666317114094085,
"f1_stderr": 0.00476283705283174
},
"harness|gsm8k|5": {
"acc": 0.0803639120545868,
"acc_stderr": 0.007488258573239077
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843902
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca | 2023-09-22T17:02:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bertin-project/bertin-gpt-j-6B-alpaca](https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T17:02:02.199354](https://huggingface.co/datasets/open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca/blob/main/results_2023-09-22T17-02-02.199354.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016568791946308725,\n\
\ \"em_stderr\": 0.0013072452323527502,\n \"f1\": 0.07589660234899354,\n\
\ \"f1_stderr\": 0.0018842940437008274,\n \"acc\": 0.27900552486187846,\n\
\ \"acc_stderr\": 0.006978792039554494\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.016568791946308725,\n \"em_stderr\": 0.0013072452323527502,\n\
\ \"f1\": 0.07589660234899354,\n \"f1_stderr\": 0.0018842940437008274\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5580110497237569,\n\
\ \"acc_stderr\": 0.013957584079108989\n }\n}\n```"
repo_url: https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T17_02_02.199354
path:
- '**/details_harness|drop|3_2023-09-22T17-02-02.199354.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T17-02-02.199354.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T17_02_02.199354
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-02-02.199354.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-02-02.199354.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T17_02_02.199354
path:
- '**/details_harness|winogrande|5_2023-09-22T17-02-02.199354.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T17-02-02.199354.parquet'
- config_name: results
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- results_2023-08-17T15:41:33.782681.parquet
- split: 2023_09_22T17_02_02.199354
path:
- results_2023-09-22T17-02-02.199354.parquet
- split: latest
path:
- results_2023-09-22T17-02-02.199354.parquet
---
# Dataset Card for Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bertin-project/bertin-gpt-j-6B-alpaca](https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca",
"harness_winogrande_5",
split="train")
```
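Every evaluated task also has its own configuration (see the full list in the metadata above), so a single sub-task can be inspected in isolation. A minimal sketch, using the `harness_hendrycksTest_world_religions_5` configuration and the `latest` split declared above:
```python
from datasets import load_dataset

# Per-sample details for one MMLU sub-task, taken from the most recent run
# that covered it (2023-08-17T15:41:33.782681 in the file list above).
details = load_dataset(
    "open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details.column_names)
```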
## Latest results
These are the [latest results from run 2023-09-22T17:02:02.199354](https://huggingface.co/datasets/open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca/blob/main/results_2023-09-22T17-02-02.199354.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.016568791946308725,
"em_stderr": 0.0013072452323527502,
"f1": 0.07589660234899354,
"f1_stderr": 0.0018842940437008274,
"acc": 0.27900552486187846,
"acc_stderr": 0.006978792039554494
},
"harness|drop|3": {
"em": 0.016568791946308725,
"em_stderr": 0.0013072452323527502,
"f1": 0.07589660234899354,
"f1_stderr": 0.0018842940437008274
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5580110497237569,
"acc_stderr": 0.013957584079108989
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NotSharpe/MidwxstRVC | 2023-08-18T19:12:23.000Z | [
"license:openrail",
"region:us"
] | NotSharpe | null | null | null | 0 | 0 | ---
license: openrail
---
|
AI-C/lora-clone | 2023-08-18T19:12:39.000Z | [
"region:us"
] | AI-C | null | null | null | 0 | 0 | Entry not found |
pixelpandacreative/resumes | 2023-08-18T19:20:46.000Z | [
"task_categories:token-classification",
"task_categories:table-question-answering",
"size_categories:10K<n<100K",
"license:openrail",
"region:us"
] | pixelpandacreative | null | null | null | 0 | 0 | ---
license: openrail
task_categories:
- token-classification
- table-question-answering
pretty_name: Resumes
size_categories:
- 10K<n<100K
--- |
shinyruosuki/lora | 2023-08-18T19:51:52.000Z | [
"license:other",
"region:us"
] | shinyruosuki | null | null | null | 0 | 0 | ---
license: other
---
|
semoga33434/kencut | 2023-08-18T20:00:45.000Z | [
"region:us"
] | semoga33434 | null | null | null | 0 | 0 | Entry not found |
Shafagh/aya_persian_instruction_pn-summary | 2023-08-18T20:00:39.000Z | [
"region:us"
] | Shafagh | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 275447917
num_examples: 82022
- name: validation
num_bytes: 18998581
num_examples: 5592
- name: test
num_bytes: 18622744
num_examples: 5593
download_size: 142375579
dataset_size: 313069242
---
# Dataset Card for "aya_persian_instruction_pn-summary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Thunderbird_BERT_Finetuned | 2023-08-23T03:56:46.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115576729.6875
num_examples: 37500
- name: test
num_bytes: 38525577.5
num_examples: 12500
download_size: 211881253
dataset_size: 154102307.1875
---
# Dataset Card for "Thunderbird_BERT_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/code_instructions_122k_alpaca_style_standardized | 2023-08-18T20:06:08.000Z | [
"region:us"
] | HydraLM | null | null | null | 0 | 0 | Entry not found |
kranthigv/databricks-dolly-15k_standardized | 2023-08-18T20:12:41.000Z | [
"region:us"
] | kranthigv | null | null | null | 0 | 0 | Entry not found |
EgilKarlsen/Thunderbird_RoBERTa_Finetuned | 2023-08-23T04:05:06.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115576729.6875
num_examples: 37500
- name: test
num_bytes: 38525577.5
num_examples: 12500
download_size: 211881241
dataset_size: 154102307.1875
---
# Dataset Card for "Thunderbird_RoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/databricks-dolly-15k_standardized | 2023-08-18T20:13:17.000Z | [
"region:us"
] | HydraLM | null | null | null | 0 | 0 | Entry not found |
EgilKarlsen/Thunderbird_DistilRoBERTa_Finetuned | 2023-08-23T04:12:34.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115576729.6875
num_examples: 37500
- name: test
num_bytes: 38525577.5
num_examples: 12500
download_size: 211881255
dataset_size: 154102307.1875
---
# Dataset Card for "Thunderbird_DistilRoBERTa_Finetuned"
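The schema above describes rows of 768 float32 columns ('0' through '767'), one per embedding dimension, plus a string `label`, split into 37,500 train and 12,500 test examples. A minimal loading sketch follows; the repository id is an assumption, inferred from the `EgilKarlsen/` namespace used by the neighbouring Thunderbird entries.

```python
from datasets import load_dataset
import numpy as np

# Assumed repo id (namespace inferred from the adjacent Thunderbird datasets).
data = load_dataset("EgilKarlsen/Thunderbird_DistilRoBERTa_Finetuned", split="train")

# Stack the per-dimension columns '0'..'767' into an (n_examples, 768) matrix.
feature_cols = [str(i) for i in range(768)]
X = np.stack([np.asarray(data[col], dtype=np.float32) for col in feature_cols], axis=1)
y = data["label"]
print(X.shape, len(y))  # expected: (37500, 768) 37500
```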
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Thunderbird_GPT2_Finetuned | 2023-08-23T04:20:47.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115576729.6875
num_examples: 37500
- name: test
num_bytes: 38525577.5
num_examples: 12500
download_size: 211858414
dataset_size: 154102307.1875
---
# Dataset Card for "Thunderbird_GPT2_Finetuned"
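This card shares the layout of the DistilRoBERTa variant above: 768 float32 embedding columns ('0'–'767'), a string `label`, and a 37,500/12,500 train/test split. A minimal sketch for loading the default config and inspecting it, using the repo id `EgilKarlsen/Thunderbird_GPT2_Finetuned` listed for this entry:

```python
from datasets import load_dataset

# Load both splits of the default config.
ds = load_dataset("EgilKarlsen/Thunderbird_GPT2_Finetuned")

print(ds)                              # DatasetDict: train (37500 rows), test (12500 rows)
print(ds["train"].features["label"])   # Value(dtype='string')

# A pandas view of the test split: 768 embedding columns plus 'label'.
test_df = ds["test"].to_pandas()
print(test_df.shape)                   # expected: (12500, 769)
```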
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/Thunderbird_GPTNEO_Finetuned | 2023-08-23T05:02:52.000Z | [
"region:us"
] | EgilKarlsen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 307576729.6875
num_examples: 37500
- name: test
num_bytes: 102525577.5
num_examples: 12500
download_size: 565394599
dataset_size: 410102307.1875
---
# Dataset Card for "Thunderbird_GPTNEO_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cacheop/red-right-hand | 2023-08-18T21:44:01.000Z | [
"license:other",
"region:us"
] | cacheop | null | null | null | 0 | 0 | ---
license: other
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 711683
num_examples: 315
download_size: 338831
dataset_size: 711683
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jing24/seperate_0 | 2023-08-18T21:39:18.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 8063353
num_examples: 9208
download_size: 1455012
dataset_size: 8063353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_1 | 2023-08-18T21:39:21.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 7568218
num_examples: 8021
download_size: 1401063
dataset_size: 7568218
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_2 | 2023-08-18T21:39:23.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 6921907
num_examples: 7848
download_size: 1327593
dataset_size: 6921907
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_3 | 2023-08-18T21:39:24.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 6880782
num_examples: 7720
download_size: 1220030
dataset_size: 6880782
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_4 | 2023-08-18T21:39:27.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 7565982
num_examples: 8162
download_size: 1382972
dataset_size: 7565982
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_5 | 2023-08-18T21:39:29.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 8027732
num_examples: 8533
download_size: 1385199
dataset_size: 8027732
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_6 | 2023-08-18T21:39:31.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 7073760
num_examples: 7809
download_size: 1306657
dataset_size: 7073760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_7 | 2023-08-18T21:39:33.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 6301415
num_examples: 6897
download_size: 1157468
dataset_size: 6301415
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_8 | 2023-08-18T21:39:35.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 7635652
num_examples: 8731
download_size: 1353779
dataset_size: 7635652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_9 | 2023-08-18T21:39:37.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 6039075
num_examples: 6803
download_size: 1167345
dataset_size: 6039075
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_10 | 2023-08-18T21:39:39.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 6991061
num_examples: 7503
download_size: 1295926
dataset_size: 6991061
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_all0 | 2023-08-18T21:41:53.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 8063353
num_examples: 9208
download_size: 1455012
dataset_size: 8063353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_all0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_all_sub0 | 2023-08-18T21:41:56.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 71282755
num_examples: 78391
download_size: 13012921
dataset_size: 71282755
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_all_sub0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/seperate_all1 | 2023-08-18T21:41:58.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 15631571
num_examples: 17229
download_size: 2844837
dataset_size: 15631571
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_all1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |