---
pretty_name: Evaluation run of kittn/mistral-7B-v0.1-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kittn/mistral-7B-v0.1-hf](https://huggingface.co/kittn/mistral-7B-v0.1-hf) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the results of\
\ the most recent run.\n\nAn additional configuration \"results\" stores all the\
\ aggregated results of the run (and is used to compute and display the aggregated\
\ metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:50:01.602909](https://huggingface.co/datasets/open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf/blob/main/results_2023-10-03T19-50-01.602909.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.6386756847072136,\n\
\ \"acc_stderr\": 0.03291876255893497,\n \"acc_norm\": 0.6426937138267004,\n\
\ \"acc_norm_stderr\": 0.03289725744813461,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4212005766948656,\n\
\ \"mc2_stderr\": 0.01414122762052897\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137996,\n\
\ \"acc_norm\": 0.6023890784982935,\n \"acc_norm_stderr\": 0.014301752223279542\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6287592113124876,\n\
\ \"acc_stderr\": 0.004821492994082127,\n \"acc_norm\": 0.8333997211710814,\n\
\ \"acc_norm_stderr\": 0.003718570792719566\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391545,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391545\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.033888571185023246,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.033888571185023246\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\
\ \"acc_stderr\": 0.015652542496421118,\n \"acc_norm\": 0.3240223463687151,\n\
\ \"acc_norm_stderr\": 0.015652542496421118\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046102,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046102\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4212005766948656,\n\
\ \"mc2_stderr\": 0.01414122762052897\n }\n}\n```"
repo_url: https://huggingface.co/kittn/mistral-7B-v0.1-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-42-22.443456.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-50-01.602909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-50-01.602909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-42-22.443456.parquet'
- split: 2023_10_03T19_50_01.602909
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-50-01.602909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-50-01.602909.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_42_22.443456
path:
- results_2023-10-03T19-42-22.443456.parquet
- split: 2023_10_03T19_50_01.602909
path:
- results_2023-10-03T19-50-01.602909.parquet
- split: latest
path:
- results_2023-10-03T19-50-01.602909.parquet
---
# Dataset Card for Evaluation run of kittn/mistral-7B-v0.1-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kittn/mistral-7B-v0.1-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kittn/mistral-7B-v0.1-hf](https://huggingface.co/kittn/mistral-7B-v0.1-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf",
"harness_truthfulqa_mc_0",
split="train")
```
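Both the configuration name and the split are discoverable programmatically. Below is a minimal sketch (assuming only the `datasets` library is installed) that lists the available configurations and loads a specific timestamped run instead of the latest one:
```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the 61 configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf"
)
print(len(configs), configs[:3])

# Load a specific run by its timestamped split rather than "latest".
data = load_dataset(
    "open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf",
    "harness_truthfulqa_mc_0",
    split="2023_10_03T19_42_22.443456",
)
```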
## Latest results
These are the [latest results from run 2023-10-03T19:50:01.602909](https://huggingface.co/datasets/open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf/blob/main/results_2023-10-03T19-50-01.602909.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6386756847072136,
"acc_stderr": 0.03291876255893497,
"acc_norm": 0.6426937138267004,
"acc_norm_stderr": 0.03289725744813461,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4212005766948656,
"mc2_stderr": 0.01414122762052897
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137996,
"acc_norm": 0.6023890784982935,
"acc_norm_stderr": 0.014301752223279542
},
"harness|hellaswag|10": {
"acc": 0.6287592113124876,
"acc_stderr": 0.004821492994082127,
"acc_norm": 0.8333997211710814,
"acc_norm_stderr": 0.003718570792719566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391545,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391545
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.033888571185023246,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.033888571185023246
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.015652542496421118,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.015652542496421118
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046102,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4212005766948656,
"mc2_stderr": 0.01414122762052897
}
}
```
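The same aggregated numbers are stored in the "results" configuration; a minimal sketch for reading them programmatically (assuming the parquet rows mirror the JSON structure above) could look like this:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; its "latest"
# split points at results_2023-10-03T19-50-01.602909.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / mc1 / mc2 values
```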
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TSHMatheus/Tim-Defense | 2023-10-04T15:11:25.000Z | [
"license:unknown",
"region:us"
] | TSHMatheus | null | null | null | 0 | 0 | ---
license: unknown
---
|
open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0 | 2023-10-03T19:48:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of meta-math/MetaMath-13B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [meta-math/MetaMath-13B-V1.0](https://huggingface.co/meta-math/MetaMath-13B-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:47:07.095350](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0/blob/main/results_2023-10-03T19-47-07.095350.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47912070413560215,\n\
\ \"acc_stderr\": 0.03491107886872343,\n \"acc_norm\": 0.48260227232839065,\n\
\ \"acc_norm_stderr\": 0.03490008819568503,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.01576477083677731,\n \"mc2\": 0.41575339609808976,\n\
\ \"mc2_stderr\": 0.01560446973515796\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4667235494880546,\n \"acc_stderr\": 0.014578995859605811,\n\
\ \"acc_norm\": 0.4948805460750853,\n \"acc_norm_stderr\": 0.014610624890309157\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5875323640709023,\n\
\ \"acc_stderr\": 0.004912723848944791,\n \"acc_norm\": 0.7647878908583947,\n\
\ \"acc_norm_stderr\": 0.004232645108976139\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934265,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934265\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5548387096774193,\n\
\ \"acc_stderr\": 0.028272410186214906,\n \"acc_norm\": 0.5548387096774193,\n\
\ \"acc_norm_stderr\": 0.028272410186214906\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.03422398565657551,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.03422398565657551\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.0251891498947642,\n \
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.0251891498947642\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.671559633027523,\n \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\"\
: 0.671559633027523,\n \"acc_norm_stderr\": 0.02013590279729841\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n\
\ \"acc_stderr\": 0.03256850570293647,\n \"acc_norm\": 0.35185185185185186,\n\
\ \"acc_norm_stderr\": 0.03256850570293647\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6160337552742616,\n \"acc_stderr\": 0.031658678064106674,\n \
\ \"acc_norm\": 0.6160337552742616,\n \"acc_norm_stderr\": 0.031658678064106674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.0332319730294294,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.0332319730294294\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.04385162325601553,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.04385162325601553\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978814,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978814\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.039265223787088424,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.039265223787088424\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.029202540153431166,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.029202540153431166\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.01661750173876338,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.01661750173876338\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.02678881193156275,\n\
\ \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.02678881193156275\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30726256983240224,\n\
\ \"acc_stderr\": 0.015430158846469621,\n \"acc_norm\": 0.30726256983240224,\n\
\ \"acc_norm_stderr\": 0.015430158846469621\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4738562091503268,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.028274359854894248,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.028274359854894248\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.333116036505867,\n\
\ \"acc_stderr\": 0.012037930451512054,\n \"acc_norm\": 0.333116036505867,\n\
\ \"acc_norm_stderr\": 0.012037930451512054\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.434640522875817,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.01576477083677731,\n \"mc2\": 0.41575339609808976,\n\
\ \"mc2_stderr\": 0.01560446973515796\n }\n}\n```"
repo_url: https://huggingface.co/meta-math/MetaMath-13B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-07.095350.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- results_2023-10-03T19-47-07.095350.parquet
- split: latest
path:
- results_2023-10-03T19-47-07.095350.parquet
---
# Dataset Card for Evaluation run of meta-math/MetaMath-13B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/meta-math/MetaMath-13B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [meta-math/MetaMath-13B-V1.0](https://huggingface.co/meta-math/MetaMath-13B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0",
"harness_truthfulqa_mc_0",
split="train")
```
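Each configuration also exposes a `latest` split (see the YAML front matter above), so the most recent run can be pinned explicitly instead of relying on `train`. A minimal sketch, reusing the same configuration:
```python
from datasets import load_dataset

# "latest" resolves to the most recent run's parquet file for this task,
# per the data_files mapping in the front matter above.
data = load_dataset(
    "open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```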
## Latest results
These are the [latest results from run 2023-10-03T19:47:07.095350](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0/blob/main/results_2023-10-03T19-47-07.095350.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47912070413560215,
"acc_stderr": 0.03491107886872343,
"acc_norm": 0.48260227232839065,
"acc_norm_stderr": 0.03490008819568503,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.01576477083677731,
"mc2": 0.41575339609808976,
"mc2_stderr": 0.01560446973515796
},
"harness|arc:challenge|25": {
"acc": 0.4667235494880546,
"acc_stderr": 0.014578995859605811,
"acc_norm": 0.4948805460750853,
"acc_norm_stderr": 0.014610624890309157
},
"harness|hellaswag|10": {
"acc": 0.5875323640709023,
"acc_stderr": 0.004912723848944791,
"acc_norm": 0.7647878908583947,
"acc_norm_stderr": 0.004232645108976139
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934265,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934265
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5548387096774193,
"acc_stderr": 0.028272410186214906,
"acc_norm": 0.5548387096774193,
"acc_norm_stderr": 0.028272410186214906
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.0251891498947642,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.0251891498947642
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6160337552742616,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.6160337552742616,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.0332319730294294,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.0332319730294294
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.04385162325601553,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.04385162325601553
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978814,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978814
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.029202540153431166,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.029202540153431166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.01661750173876338,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.01661750173876338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.02678881193156275,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.02678881193156275
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30726256983240224,
"acc_stderr": 0.015430158846469621,
"acc_norm": 0.30726256983240224,
"acc_norm_stderr": 0.015430158846469621
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.028274359854894248,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.028274359854894248
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.333116036505867,
"acc_stderr": 0.012037930451512054,
"acc_norm": 0.333116036505867,
"acc_norm_stderr": 0.012037930451512054
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268813,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268813
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.01576477083677731,
"mc2": 0.41575339609808976,
"mc2_stderr": 0.01560446973515796
}
}
```
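These aggregated numbers are also available programmatically through the "results" configuration declared in the front matter. A minimal sketch, assuming its parquet schema mirrors the JSON shown above:
```python
from datasets import load_dataset

# The "results" config maps both its timestamped split and "latest"
# to results_2023-10-03T19-47-07.095350.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated results table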
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
J00rge/OrquestraCid | 2023-10-03T20:01:54.000Z | [
"license:openrail",
"region:us"
] | J00rge | null | null | null | 0 | 0 | ---
license: openrail
---
|
awacke1/Test | 2023-10-03T19:57:20.000Z | [
"region:us"
] | awacke1 | null | null | null | 0 | 0 | Entry not found |
zeyuanyin/SRe2L | 2023-10-10T16:50:37.000Z | [
"license:mit",
"region:us"
] | zeyuanyin | null | null | null | 0 | 0 | ---
license: mit
---
|
razhan/diyako_hashemi_yt | 2023-10-03T21:28:48.000Z | [
"region:us"
] | razhan | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 8024831110.886
num_examples: 24207
download_size: 6774073877
dataset_size: 8024831110.886
---
# Dataset Card for "diyako_hashemi_yt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atom-in-the-universe/bild-03a93168-1d45-4a39-952a-0f20ab60ab28 | 2023-10-03T20:56:30.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-4e9f68b6-5eec-4c48-ac43-b6e94650840b | 2023-10-03T20:58:36.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-dd1f08b7-dfa2-4a82-b3ff-decf8e8c510b | 2023-10-03T20:59:25.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-242a8496-bb31-4adf-9cb4-fb1aa0f25213 | 2023-10-03T21:01:45.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_compare8k2 | 2023-10-03T21:02:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wei123602/Llama-2-13b-FINETUNE4_compare8k2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/Llama-2-13b-FINETUNE4_compare8k2](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_compare8k2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_compare8k2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T21:01:32.366658](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_compare8k2/blob/main/results_2023-10-03T21-01-32.366658.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5687964083352047,\n\
\ \"acc_stderr\": 0.03428422878368393,\n \"acc_norm\": 0.5730564504442265,\n\
\ \"acc_norm_stderr\": 0.034264760052846295,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024654,\n \"mc2\": 0.3985894900243793,\n\
\ \"mc2_stderr\": 0.014339806253558348\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636586,\n\
\ \"acc_norm\": 0.5827645051194539,\n \"acc_norm_stderr\": 0.014409825518403077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6060545708026289,\n\
\ \"acc_stderr\": 0.004876243842318609,\n \"acc_norm\": 0.8138816968731328,\n\
\ \"acc_norm_stderr\": 0.0038840668811314745\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347364,\n \"\
acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347364\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767755,\n\
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767755\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630793,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630793\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716323,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716323\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905706,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905706\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306376,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306376\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.014987325439963554,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.014987325439963554\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.02685882587948854,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.02685882587948854\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037082,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037082\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777515,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935559,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935559\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024654,\n \"mc2\": 0.3985894900243793,\n\
\ \"mc2_stderr\": 0.014339806253558348\n }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_compare8k2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|arc:challenge|25_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hellaswag|10_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-01-32.366658.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-01-32.366658.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T21-01-32.366658.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T21-01-32.366658.parquet'
- config_name: results
data_files:
- split: 2023_10_03T21_01_32.366658
path:
- results_2023-10-03T21-01-32.366658.parquet
- split: latest
path:
- results_2023-10-03T21-01-32.366658.parquet
---
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_compare8k2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_compare8k2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4_compare8k2](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_compare8k2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_compare8k2",
"harness_truthfulqa_mc_0",
split="train")
```
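The repository metadata above also defines a `latest` split for every configuration, plus an aggregated `results` configuration. A minimal sketch for discovering and loading those, assuming only the `datasets` library; the repository and configuration names are the ones shown in this card:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_compare8k2"

# List every configuration in this repository
# (one per evaluated task, plus the aggregated "results" configuration).
print(get_dataset_config_names(repo))

# The "results" configuration stores the aggregated metrics of the run;
# its "latest" split always points to the most recent evaluation.
results = load_dataset(repo, "results", split="latest")
print(results)
```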
## Latest results
These are the [latest results from run 2023-10-03T21:01:32.366658](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_compare8k2/blob/main/results_2023-10-03T21-01-32.366658.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5687964083352047,
"acc_stderr": 0.03428422878368393,
"acc_norm": 0.5730564504442265,
"acc_norm_stderr": 0.034264760052846295,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024654,
"mc2": 0.3985894900243793,
"mc2_stderr": 0.014339806253558348
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636586,
"acc_norm": 0.5827645051194539,
"acc_norm_stderr": 0.014409825518403077
},
"harness|hellaswag|10": {
"acc": 0.6060545708026289,
"acc_stderr": 0.004876243842318609,
"acc_norm": 0.8138816968731328,
"acc_norm_stderr": 0.0038840668811314745
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347364,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347364
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624528,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.024864995159767755,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.024864995159767755
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630793,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630793
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716323,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716323
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.044939490686135404,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.044939490686135404
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905706,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905706
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306376,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306376
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963554,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963554
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948854,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948854
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037082,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037082
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777515,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935559,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935559
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024654,
"mc2": 0.3985894900243793,
"mc2_stderr": 0.014339806253558348
}
}
```
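The "all" block is the aggregate over the per-task entries; as a sanity check, a small sketch (plain Python, with hypothetical demo values) that should approximately recompute it as the mean of the per-task accuracies. Note that `harness|truthfulqa:mc|0` reports `mc1`/`mc2` rather than `acc`, so the filter skips it:
```python
# `results` is assumed to be a dict shaped like the one printed above,
# e.g. parsed from the linked results JSON with json.load().
def mean_acc(results: dict) -> float:
    # Average "acc" over the per-task entries; "all" is the aggregate
    # itself, and TruthfulQA only reports mc1/mc2, so both are skipped.
    per_task = [v["acc"] for name, v in results.items()
                if name != "all" and "acc" in v]
    return sum(per_task) / len(per_task)

# Tiny demo with hypothetical values:
demo = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.4},
    "harness|hellaswag|10": {"acc": 0.6},
    "harness|truthfulqa:mc|0": {"mc1": 0.28, "mc2": 0.40},
}
print(mean_acc(demo))  # -> 0.5
```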
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-c4528fcd-c139-4628-97a9-46672fd84728 | 2023-10-03T21:02:46.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-23aaa821-2610-4a12-939b-b7f094530595 | 2023-10-03T21:09:04.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-a6333777-4a4c-4c6c-b409-8276ca157090 | 2023-10-03T21:10:20.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-108ec46f-0bf8-4a4d-a202-062f187838e3 | 2023-10-03T21:10:57.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-cae2910c-9592-4a41-af97-2bb0a78b49b1 | 2023-10-03T21:14:12.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-c0fe0291-95f0-472b-95f1-4ae46befc9a9 | 2023-10-03T21:42:28.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
gjoy/validatellm | 2023-10-03T21:34:55.000Z | [
"region:us"
] | gjoy | null | null | null | 0 | 0 | Entry not found |
salsarra/AQAD_SPLIT | 2023-10-03T21:37:56.000Z | [
"region:us"
] | salsarra | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__Guanaco-3B-Uncensored-v2-GPTQ | 2023-10-03T21:40:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Guanaco-3B-Uncensored-v2-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Guanaco-3B-Uncensored-v2-GPTQ](https://huggingface.co/TheBloke/Guanaco-3B-Uncensored-v2-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Guanaco-3B-Uncensored-v2-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T21:39:11.409465](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Guanaco-3B-Uncensored-v2-GPTQ/blob/main/results_2023-10-03T21-39-11.409465.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26796194534851764,\n\
\ \"acc_stderr\": 0.0320582805520218,\n \"acc_norm\": 0.27162906211374094,\n\
\ \"acc_norm_stderr\": 0.032059588358959924,\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506978,\n \"mc2\": 0.3658408497762684,\n\
\ \"mc2_stderr\": 0.013884287044021056\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3703071672354949,\n \"acc_stderr\": 0.014111298751674948,\n\
\ \"acc_norm\": 0.41638225255972694,\n \"acc_norm_stderr\": 0.014405618279436172\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4772953594901414,\n\
\ \"acc_stderr\": 0.00498463428510162,\n \"acc_norm\": 0.6475801633140809,\n\
\ \"acc_norm_stderr\": 0.004767475366689784\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.13725490196078433,\n \"acc_stderr\": 0.034240846698915216,\n\
\ \"acc_norm\": 0.13725490196078433,\n \"acc_norm_stderr\": 0.034240846698915216\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.19574468085106383,\n \"acc_stderr\": 0.025937853139977148,\n\
\ \"acc_norm\": 0.19574468085106383,\n \"acc_norm_stderr\": 0.025937853139977148\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.03664666337225256,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.03664666337225256\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.03090379695211449,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.03090379695211449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2794871794871795,\n \"acc_stderr\": 0.02275238883977683,\n \
\ \"acc_norm\": 0.2794871794871795,\n \"acc_norm_stderr\": 0.02275238883977683\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26788990825688075,\n\
\ \"acc_stderr\": 0.01898746225797865,\n \"acc_norm\": 0.26788990825688075,\n\
\ \"acc_norm_stderr\": 0.01898746225797865\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n\
\ \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.0462028408228004,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.0462028408228004\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094455,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094455\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.015491088951494576,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.015491088951494576\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261466,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261466\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879905,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2711864406779661,\n\
\ \"acc_stderr\": 0.011354581451622985,\n \"acc_norm\": 0.2711864406779661,\n\
\ \"acc_norm_stderr\": 0.011354581451622985\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.022770868010113025,\n\
\ \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.022770868010113025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23202614379084968,\n \"acc_stderr\": 0.017077373377857006,\n \
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.017077373377857006\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2571428571428571,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.2571428571428571,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506978,\n \"mc2\": 0.3658408497762684,\n\
\ \"mc2_stderr\": 0.013884287044021056\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Guanaco-3B-Uncensored-v2-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|arc:challenge|25_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hellaswag|10_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-39-11.409465.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T21-39-11.409465.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T21-39-11.409465.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T21-39-11.409465.parquet'
- config_name: results
data_files:
- split: 2023_10_03T21_39_11.409465
path:
- results_2023-10-03T21-39-11.409465.parquet
- split: latest
path:
- results_2023-10-03T21-39-11.409465.parquet
---
# Dataset Card for Evaluation run of TheBloke/Guanaco-3B-Uncensored-v2-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Guanaco-3B-Uncensored-v2-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Guanaco-3B-Uncensored-v2-GPTQ](https://huggingface.co/TheBloke/Guanaco-3B-Uncensored-v2-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Guanaco-3B-Uncensored-v2-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
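The aggregated scores can be loaded the same way through the `results` configuration defined above, which exposes one split per run plus a `latest` split. A minimal sketch, assuming only the `datasets` library:
```python
from datasets import load_dataset

# Aggregated metrics ("all" plus per-task entries) for the most recent run
results = load_dataset("open-llm-leaderboard/details_TheBloke__Guanaco-3B-Uncensored-v2-GPTQ",
	"results",
	split="latest")
```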
## Latest results
These are the [latest results from run 2023-10-03T21:39:11.409465](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Guanaco-3B-Uncensored-v2-GPTQ/blob/main/results_2023-10-03T21-39-11.409465.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26796194534851764,
"acc_stderr": 0.0320582805520218,
"acc_norm": 0.27162906211374094,
"acc_norm_stderr": 0.032059588358959924,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506978,
"mc2": 0.3658408497762684,
"mc2_stderr": 0.013884287044021056
},
"harness|arc:challenge|25": {
"acc": 0.3703071672354949,
"acc_stderr": 0.014111298751674948,
"acc_norm": 0.41638225255972694,
"acc_norm_stderr": 0.014405618279436172
},
"harness|hellaswag|10": {
"acc": 0.4772953594901414,
"acc_stderr": 0.00498463428510162,
"acc_norm": 0.6475801633140809,
"acc_norm_stderr": 0.004767475366689784
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.13725490196078433,
"acc_stderr": 0.034240846698915216,
"acc_norm": 0.13725490196078433,
"acc_norm_stderr": 0.034240846698915216
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19574468085106383,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.19574468085106383,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.03664666337225256,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.03664666337225256
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.03090379695211449,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.03090379695211449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2794871794871795,
"acc_stderr": 0.02275238883977683,
"acc_norm": 0.2794871794871795,
"acc_norm_stderr": 0.02275238883977683
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.01898746225797865,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.01898746225797865
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.0462028408228004,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.0462028408228004
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094455,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094455
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.015491088951494576,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.015491088951494576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28034682080924855,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.28034682080924855,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261466,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261466
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590627,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2711864406779661,
"acc_stderr": 0.011354581451622985,
"acc_norm": 0.2711864406779661,
"acc_norm_stderr": 0.011354581451622985
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16911764705882354,
"acc_stderr": 0.022770868010113025,
"acc_norm": 0.16911764705882354,
"acc_norm_stderr": 0.022770868010113025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.017077373377857006,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.017077373377857006
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2571428571428571,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.2571428571428571,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553026,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553026
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506978,
"mc2": 0.3658408497762684,
"mc2_stderr": 0.013884287044021056
}
}
```
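Individual task scores can be read straight out of this dictionary using the `harness|<task>|<n-shot>` keys shown above. A minimal sketch, assuming the JSON above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Load a local copy of the results dictionary printed above (hypothetical path)
with open("results.json") as f:
    results = json.load(f)

# Keys follow the "harness|<task>|<n-shot>" naming used above
print(results["all"]["acc"])                            # 0.26796194534851764
print(results["harness|arc:challenge|25"]["acc_norm"])  # 0.41638225255972694
```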
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-21cbae51-28f5-44aa-835a-dc3be71589e4 | 2023-10-03T21:56:38.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
Sunnytim/Story | 2023-10-03T21:51:19.000Z | [
"license:apache-2.0",
"region:us"
] | Sunnytim | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
atom-in-the-universe/bild-21fdb3c7-5474-46a6-9045-0630138fdab8 | 2023-10-03T22:09:35.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
gvst/phil-bench | 2023-10-04T00:15:32.000Z | [
"license:mit",
"region:us"
] | gvst | null | null | null | 0 | 0 | ---
license: mit
---
|
atom-in-the-universe/bild-fd1fd086-5b21-4ef6-b75e-80e743e53e85 | 2023-10-03T22:24:48.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_WizardLM__WizardCoder-Python-7B-V1.0 | 2023-10-03T22:14:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of WizardLM/WizardCoder-Python-7B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WizardLM/WizardCoder-Python-7B-V1.0](https://huggingface.co/WizardLM/WizardCoder-Python-7B-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardCoder-Python-7B-V1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T22:13:00.880778](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardCoder-Python-7B-V1.0/blob/main/results_2023-10-03T22-13-00.880778.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.32702069786637533,\n\
\ \"acc_stderr\": 0.03390524920155841,\n \"acc_norm\": 0.33007114162819035,\n\
\ \"acc_norm_stderr\": 0.0339041485500483,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752332,\n \"mc2\": 0.3631503459535805,\n\
\ \"mc2_stderr\": 0.014229199741184343\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3890784982935154,\n \"acc_stderr\": 0.014247309976045607,\n\
\ \"acc_norm\": 0.4180887372013652,\n \"acc_norm_stderr\": 0.014413988396996072\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49960167297351127,\n\
\ \"acc_stderr\": 0.004989779828043845,\n \"acc_norm\": 0.6505676160127465,\n\
\ \"acc_norm_stderr\": 0.004758162967997396\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.03745554791462457\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\
\ \"acc_stderr\": 0.035506839891655796,\n \"acc_norm\": 0.3179190751445087,\n\
\ \"acc_norm_stderr\": 0.035506839891655796\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412424,\n\
\ \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022057,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022057\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185554,\n\
\ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031726,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2838709677419355,\n \"acc_stderr\": 0.02564938106302926,\n \"\
acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.02564938106302926\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"\
acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.42424242424242425,\n \"acc_stderr\": 0.03859268142070262,\n\
\ \"acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.03859268142070262\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756775,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756775\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.0234009289183105,\n \
\ \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.0234009289183105\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.18888888888888888,\n \"acc_stderr\": 0.023865318862285326,\n \
\ \"acc_norm\": 0.18888888888888888,\n \"acc_norm_stderr\": 0.023865318862285326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869666,\n \"\
acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869666\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4068627450980392,\n \"acc_stderr\": 0.03447891136353382,\n \"\
acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.41350210970464135,\n \"acc_stderr\": 0.03205649904851858,\n \
\ \"acc_norm\": 0.41350210970464135,\n \"acc_norm_stderr\": 0.03205649904851858\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.042943408452120954,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.042943408452120954\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.45726495726495725,\n\
\ \"acc_stderr\": 0.03263622596380688,\n \"acc_norm\": 0.45726495726495725,\n\
\ \"acc_norm_stderr\": 0.03263622596380688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.41762452107279696,\n \"acc_stderr\": 0.01763563732695152,\n\
\ \"acc_norm\": 0.41762452107279696,\n \"acc_norm_stderr\": 0.01763563732695152\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.025992472029306383,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.025992472029306383\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.26927374301675977,\n \"acc_stderr\": 0.014835616582882611,\n\
\ \"acc_norm\": 0.26927374301675977,\n \"acc_norm_stderr\": 0.014835616582882611\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3202614379084967,\n\
\ \"acc_stderr\": 0.026716118380156837,\n \"acc_norm\": 0.3202614379084967,\n\
\ \"acc_norm_stderr\": 0.026716118380156837\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.3729903536977492,\n \"acc_stderr\": 0.027466610213140116,\n\
\ \"acc_norm\": 0.3729903536977492,\n \"acc_norm_stderr\": 0.027466610213140116\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.02584224870090217,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.02584224870090217\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460987,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460987\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30638852672750977,\n\
\ \"acc_stderr\": 0.011773980329380727,\n \"acc_norm\": 0.30638852672750977,\n\
\ \"acc_norm_stderr\": 0.011773980329380727\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3382352941176471,\n \"acc_stderr\": 0.019139943748487046,\n \
\ \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.019139943748487046\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3681592039800995,\n\
\ \"acc_stderr\": 0.03410410565495302,\n \"acc_norm\": 0.3681592039800995,\n\
\ \"acc_norm_stderr\": 0.03410410565495302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.03629335329947859,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.03629335329947859\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529917,\n\
\ \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529917\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752332,\n \"mc2\": 0.3631503459535805,\n\
\ \"mc2_stderr\": 0.014229199741184343\n }\n}\n```"
repo_url: https://huggingface.co/WizardLM/WizardCoder-Python-7B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-13-00.880778.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-13-00.880778.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-13-00.880778.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-13-00.880778.parquet'
- config_name: results
data_files:
- split: 2023_10_03T22_13_00.880778
path:
- results_2023-10-03T22-13-00.880778.parquet
- split: latest
path:
- results_2023-10-03T22-13-00.880778.parquet
---
# Dataset Card for Evaluation run of WizardLM/WizardCoder-Python-7B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardCoder-Python-7B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardCoder-Python-7B-V1.0](https://huggingface.co/WizardLM/WizardCoder-Python-7B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardCoder-Python-7B-V1.0",
"harness_truthfulqa_mc_0",
split="train")
```
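For example, to pull only the most recent details for one of the task configurations listed above, you could request the "latest" split instead of "train" (a minimal sketch; any of the `harness_hendrycksTest_*` configurations works the same way):
```python
from datasets import load_dataset

# The "latest" split always mirrors the newest timestamped run for this configuration.
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardCoder-Python-7B-V1.0",
	"harness_hendrycksTest_econometrics_5",
	split="latest")
```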
## Latest results
These are the [latest results from run 2023-10-03T22:13:00.880778](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardCoder-Python-7B-V1.0/blob/main/results_2023-10-03T22-13-00.880778.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.32702069786637533,
"acc_stderr": 0.03390524920155841,
"acc_norm": 0.33007114162819035,
"acc_norm_stderr": 0.0339041485500483,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752332,
"mc2": 0.3631503459535805,
"mc2_stderr": 0.014229199741184343
},
"harness|arc:challenge|25": {
"acc": 0.3890784982935154,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.4180887372013652,
"acc_norm_stderr": 0.014413988396996072
},
"harness|hellaswag|10": {
"acc": 0.49960167297351127,
"acc_stderr": 0.004989779828043845,
"acc_norm": 0.6505676160127465,
"acc_norm_stderr": 0.004758162967997396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03745554791462457,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03745554791462457
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.035506839891655796,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.035506839891655796
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412424,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022057,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022057
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185554,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031726,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.03859268142070262,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.03859268142070262
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.0234009289183105,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.0234009289183105
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.18888888888888888,
"acc_stderr": 0.023865318862285326,
"acc_norm": 0.18888888888888888,
"acc_norm_stderr": 0.023865318862285326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.41350210970464135,
"acc_stderr": 0.03205649904851858,
"acc_norm": 0.41350210970464135,
"acc_norm_stderr": 0.03205649904851858
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.042943408452120954,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.042943408452120954
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.45726495726495725,
"acc_stderr": 0.03263622596380688,
"acc_norm": 0.45726495726495725,
"acc_norm_stderr": 0.03263622596380688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.41762452107279696,
"acc_stderr": 0.01763563732695152,
"acc_norm": 0.41762452107279696,
"acc_norm_stderr": 0.01763563732695152
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.025992472029306383,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.025992472029306383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26927374301675977,
"acc_stderr": 0.014835616582882611,
"acc_norm": 0.26927374301675977,
"acc_norm_stderr": 0.014835616582882611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3202614379084967,
"acc_stderr": 0.026716118380156837,
"acc_norm": 0.3202614379084967,
"acc_norm_stderr": 0.026716118380156837
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3729903536977492,
"acc_stderr": 0.027466610213140116,
"acc_norm": 0.3729903536977492,
"acc_norm_stderr": 0.027466610213140116
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460987,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460987
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30638852672750977,
"acc_stderr": 0.011773980329380727,
"acc_norm": 0.30638852672750977,
"acc_norm_stderr": 0.011773980329380727
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.019139943748487046,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.019139943748487046
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3681592039800995,
"acc_stderr": 0.03410410565495302,
"acc_norm": 0.3681592039800995,
"acc_norm_stderr": 0.03410410565495302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947859,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947859
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4269005847953216,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.4269005847953216,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752332,
"mc2": 0.3631503459535805,
"mc2_stderr": 0.014229199741184343
}
}
```
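These aggregated numbers can also be read back programmatically from the "results" configuration defined above (a minimal sketch; the exact column layout of the loaded table may vary between harness versions):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# its "latest" split points to the newest results parquet file.
results = load_dataset("open-llm-leaderboard/details_WizardLM__WizardCoder-Python-7B-V1.0",
	"results",
	split="latest")
```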
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gjoy/csv_from_llama_base | 2023-10-03T22:17:20.000Z | [
"region:us"
] | gjoy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: study_id
dtype: int64
- name: cardiac_mri_report
dtype: string
- name: Extracted_Left_Atrial_Area
dtype: string
splits:
- name: train
num_bytes: 372425
num_examples: 200
download_size: 145523
dataset_size: 372425
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "csv_from_llama_base"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_hpcai-tech__Colossal-LLaMA-2-7b-base | 2023-10-03T22:20:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of hpcai-tech/Colossal-LLaMA-2-7b-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hpcai-tech/Colossal-LLaMA-2-7b-base](https://huggingface.co/hpcai-tech/Colossal-LLaMA-2-7b-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hpcai-tech__Colossal-LLaMA-2-7b-base\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T22:19:26.323757](https://huggingface.co/datasets/open-llm-leaderboard/details_hpcai-tech__Colossal-LLaMA-2-7b-base/blob/main/results_2023-10-03T22-19-26.323757.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5427822903250497,\n\
\ \"acc_stderr\": 0.03477989064923685,\n \"acc_norm\": 0.5466244427509336,\n\
\ \"acc_norm_stderr\": 0.03477206124984878,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.5019124221833382,\n\
\ \"mc2_stderr\": 0.015294227791010091\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4863481228668942,\n \"acc_stderr\": 0.014605943429860947,\n\
\ \"acc_norm\": 0.5349829351535836,\n \"acc_norm_stderr\": 0.01457558392201967\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5269866560446126,\n\
\ \"acc_stderr\": 0.004982508198584269,\n \"acc_norm\": 0.7050388368850826,\n\
\ \"acc_norm_stderr\": 0.0045509331425287606\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.02432631052914913,\n \"acc_norm\"\
: 0.335978835978836,\n \"acc_norm_stderr\": 0.02432631052914913\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330876,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330876\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"\
acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178263,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178263\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736236,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736236\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7151979565772669,\n\
\ \"acc_stderr\": 0.016139174096522553,\n \"acc_norm\": 0.7151979565772669,\n\
\ \"acc_norm_stderr\": 0.016139174096522553\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220503,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220503\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.028180596328259283,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.028180596328259283\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925654,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940985,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940985\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n\
\ \"acc_stderr\": 0.01257969963128926,\n \"acc_norm\": 0.41395045632333766,\n\
\ \"acc_norm_stderr\": 0.01257969963128926\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014638,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014638\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.5019124221833382,\n\
\ \"mc2_stderr\": 0.015294227791010091\n }\n}\n```"
repo_url: https://huggingface.co/hpcai-tech/Colossal-LLaMA-2-7b-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-19-26.323757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-19-26.323757.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-19-26.323757.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-19-26.323757.parquet'
- config_name: results
data_files:
- split: 2023_10_03T22_19_26.323757
path:
- results_2023-10-03T22-19-26.323757.parquet
- split: latest
path:
- results_2023-10-03T22-19-26.323757.parquet
---
# Dataset Card for Evaluation run of hpcai-tech/Colossal-LLaMA-2-7b-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hpcai-tech/Colossal-LLaMA-2-7b-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hpcai-tech/Colossal-LLaMA-2-7b-base](https://huggingface.co/hpcai-tech/Colossal-LLaMA-2-7b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# "harness_truthfulqa_mc_0" is one of the 61 per-task configurations;
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_hpcai-tech__Colossal-LLaMA-2-7b-base",
                    "harness_truthfulqa_mc_0",
                    split="train")
```
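Each run is also stored as a timestamped split, and the card exposes an aggregated `results` configuration with a `latest` split; a minimal sketch of loading it (the configuration and split names below are taken from this card's config list):
```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split always tracks
# the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_hpcai-tech__Colossal-LLaMA-2-7b-base",
                       "results",
                       split="latest")
```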
## Latest results
These are the [latest results from run 2023-10-03T22:19:26.323757](https://huggingface.co/datasets/open-llm-leaderboard/details_hpcai-tech__Colossal-LLaMA-2-7b-base/blob/main/results_2023-10-03T22-19-26.323757.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5427822903250497,
"acc_stderr": 0.03477989064923685,
"acc_norm": 0.5466244427509336,
"acc_norm_stderr": 0.03477206124984878,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.5019124221833382,
"mc2_stderr": 0.015294227791010091
},
"harness|arc:challenge|25": {
"acc": 0.4863481228668942,
"acc_stderr": 0.014605943429860947,
"acc_norm": 0.5349829351535836,
"acc_norm_stderr": 0.01457558392201967
},
"harness|hellaswag|10": {
"acc": 0.5269866560446126,
"acc_stderr": 0.004982508198584269,
"acc_norm": 0.7050388368850826,
"acc_norm_stderr": 0.0045509331425287606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.02432631052914913,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.02432631052914913
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330876,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330876
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736236,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.047500773411999854,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.047500773411999854
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922737,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922737
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7151979565772669,
"acc_stderr": 0.016139174096522553,
"acc_norm": 0.7151979565772669,
"acc_norm_stderr": 0.016139174096522553
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220503,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.028180596328259283,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.028180596328259283
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925654,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940985,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940985
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.01257969963128926,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.01257969963128926
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.02017061497496976,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.02017061497496976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.5019124221833382,
"mc2_stderr": 0.015294227791010091
}
}
```
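The per-task scores above form a plain nested dictionary: one entry per harness task plus an `"all"` aggregate. A minimal sketch of pulling out each MMLU subject's accuracy, assuming `results` is a hypothetical variable holding the dictionary printed above (not a name defined by the card):
```python
# Assuming `results` holds the dictionary printed above.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}
best = max(per_task_acc, key=per_task_acc.get)
print(best, per_task_acc[best])  # strongest MMLU subject for this run
```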
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-8f5cd045-2bf0-4420-a3d7-bb167bd0cdfa | 2023-10-03T22:37:49.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
api-misuse/java_repo_star | 2023-10-03T22:29:41.000Z | [
"region:us"
] | api-misuse | null | null | null | 0 | 0 | Entry not found |
OldisGold/webis-touche2020-v3 | 2023-10-03T22:40:57.000Z | [
"license:apache-2.0",
"region:us"
] | OldisGold | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
atom-in-the-universe/bild-f0be7494-5b6f-4d41-a81d-29f7fbf8018f | 2023-10-03T22:50:39.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-158c08a0-542b-48f2-9d5f-6190505dc08f | 2023-10-03T23:05:53.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
warleagle/1t_chat_bot_data | 2023-10-03T23:06:35.000Z | [
"region:us"
] | warleagle | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 890558
num_examples: 2083
download_size: 398939
dataset_size: 890558
---
# Dataset Card for "1t_chat_bot_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_wtang06__mpt-125m-c4 | 2023-10-03T23:06:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of wtang06/mpt-125m-c4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wtang06/mpt-125m-c4](https://huggingface.co/wtang06/mpt-125m-c4) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wtang06__mpt-125m-c4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T23:04:52.221778](https://huggingface.co/datasets/open-llm-leaderboard/details_wtang06__mpt-125m-c4/blob/main/results_2023-10-03T23-04-52.221778.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2314240573187148,\n\
\ \"acc_stderr\": 0.03071122006512167,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n }\n}\n```"
repo_url: https://huggingface.co/wtang06/mpt-125m-c4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-04-52.221778.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-04-52.221778.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-04-52.221778.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-04-52.221778.parquet'
- config_name: results
data_files:
- split: 2023_10_03T23_04_52.221778
path:
- results_2023-10-03T23-04-52.221778.parquet
- split: latest
path:
- results_2023-10-03T23-04-52.221778.parquet
---
# Dataset Card for Evaluation run of wtang06/mpt-125m-c4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wtang06/mpt-125m-c4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wtang06/mpt-125m-c4](https://huggingface.co/wtang06/mpt-125m-c4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wtang06__mpt-125m-c4",
"harness_truthfulqa_mc_0",
split="train")
```
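For example, to retrieve only the aggregated scores of the most recent run (the same numbers shown under "Latest results" below), you can load the "results" configuration together with its "latest" split, both of which are declared in this card's YAML header:
```python
from datasets import load_dataset

# "results" and "latest" are the configuration and split names declared in the
# YAML header above; the returned dataset holds the aggregated metrics of the
# most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_wtang06__mpt-125m-c4",
	"results",
	split="latest")
```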
## Latest results
These are the [latest results from run 2023-10-03T23:04:52.221778](https://huggingface.co/datasets/open-llm-leaderboard/details_wtang06__mpt-125m-c4/blob/main/results_2023-10-03T23-04-52.221778.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2314240573187148,
"acc_stderr": 0.03071122006512167,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
}
}
```
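As a minimal sketch (assuming the results JSON keeps the flat structure displayed above), you could also download this file directly and, for instance, average the accuracies of the hendrycksTest sub-tasks:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in "Latest results" above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_wtang06__mpt-125m-c4",
    filename="results_2023-10-03T23-04-52.221778.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Mean accuracy over the hendrycksTest (MMLU) sub-tasks; this assumes the
# JSON is keyed exactly as shown in the block above.
accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(sum(accs) / len(accs))
```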
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-6b45b418-c163-46de-9dbe-25554c14df65 | 2023-10-03T23:19:31.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-cc892e44-1899-45be-9fb3-220cdff3f506 | 2023-10-03T23:33:12.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
marasama/nva-nijou | 2023-10-03T23:24:22.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
elsoberanox/new-character-moon | 2023-10-03T23:28:30.000Z | [
"region:us"
] | elsoberanox | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-aef45151-f8bf-48cc-b238-b403a956eb09 | 2023-10-03T23:47:46.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_lyogavin__Anima-7B-100K | 2023-10-03T23:38:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lyogavin/Anima-7B-100K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lyogavin/Anima-7B-100K](https://huggingface.co/lyogavin/Anima-7B-100K) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lyogavin__Anima-7B-100K\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T23:37:02.912169](https://huggingface.co/datasets/open-llm-leaderboard/details_lyogavin__Anima-7B-100K/blob/main/results_2023-10-03T23-37-02.912169.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the \"results\" configuration and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33903815614723354,\n\
\ \"acc_stderr\": 0.03394335857311012,\n \"acc_norm\": 0.34277895834311756,\n\
\ \"acc_norm_stderr\": 0.03393565406853249,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731627,\n \"mc2\": 0.3784184150795517,\n\
\ \"mc2_stderr\": 0.01406913882378111\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.44283276450511944,\n \"acc_stderr\": 0.0145155738733489,\n\
\ \"acc_norm\": 0.4658703071672355,\n \"acc_norm_stderr\": 0.014577311315231102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.525094602668791,\n\
\ \"acc_stderr\": 0.004983492928102841,\n \"acc_norm\": 0.7227643895638319,\n\
\ \"acc_norm_stderr\": 0.004467189716140493\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37358490566037733,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.37358490566037733,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3741935483870968,\n\
\ \"acc_stderr\": 0.027528904299845783,\n \"acc_norm\": 0.3741935483870968,\n\
\ \"acc_norm_stderr\": 0.027528904299845783\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.0389853160557942,\n\
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.0389853160557942\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756775,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756775\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41968911917098445,\n \"acc_stderr\": 0.03561587327685882,\n\
\ \"acc_norm\": 0.41968911917098445,\n \"acc_norm_stderr\": 0.03561587327685882\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.02350757902064534,\n \
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.02350757902064534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31092436974789917,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.31092436974789917,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.43119266055045874,\n \"acc_stderr\": 0.02123336503031956,\n \"\
acc_norm\": 0.43119266055045874,\n \"acc_norm_stderr\": 0.02123336503031956\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5189873417721519,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.5189873417721519,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.31297709923664124,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.31297709923664124,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.045604560863872344,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.045604560863872344\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.0368035037128646,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.0368035037128646\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.03265903381186196,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.03265903381186196\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.44061302681992337,\n\
\ \"acc_stderr\": 0.017753396973908486,\n \"acc_norm\": 0.44061302681992337,\n\
\ \"acc_norm_stderr\": 0.017753396973908486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.33236994219653176,\n \"acc_stderr\": 0.025361168749688204,\n\
\ \"acc_norm\": 0.33236994219653176,\n \"acc_norm_stderr\": 0.025361168749688204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3633440514469453,\n\
\ \"acc_stderr\": 0.027316847674192714,\n \"acc_norm\": 0.3633440514469453,\n\
\ \"acc_norm_stderr\": 0.027316847674192714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.026869490744815254,\n\
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.026869490744815254\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29726205997392435,\n\
\ \"acc_stderr\": 0.01167334617308605,\n \"acc_norm\": 0.29726205997392435,\n\
\ \"acc_norm_stderr\": 0.01167334617308605\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3480392156862745,\n \"acc_stderr\": 0.019270998708223974,\n \
\ \"acc_norm\": 0.3480392156862745,\n \"acc_norm_stderr\": 0.019270998708223974\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.37272727272727274,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2979591836734694,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.2979591836734694,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3781094527363184,\n\
\ \"acc_stderr\": 0.03428867848778658,\n \"acc_norm\": 0.3781094527363184,\n\
\ \"acc_norm_stderr\": 0.03428867848778658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310936,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310936\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731627,\n \"mc2\": 0.3784184150795517,\n\
\ \"mc2_stderr\": 0.01406913882378111\n }\n}\n```"
repo_url: https://huggingface.co/lyogavin/Anima-7B-100K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-37-02.912169.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-37-02.912169.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-37-02.912169.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-37-02.912169.parquet'
- config_name: results
data_files:
- split: 2023_10_03T23_37_02.912169
path:
- results_2023-10-03T23-37-02.912169.parquet
- split: latest
path:
- results_2023-10-03T23-37-02.912169.parquet
---
# Dataset Card for Evaluation run of lyogavin/Anima-7B-100K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lyogavin/Anima-7B-100K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lyogavin/Anima-7B-100K](https://huggingface.co/lyogavin/Anima-7B-100K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
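You can also enumerate the available configurations programmatically; a minimal sketch with a recent version of the `datasets` library (output order is not guaranteed):
```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_lyogavin__Anima-7B-100K")
print(len(configs))
```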
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lyogavin__Anima-7B-100K",
"harness_truthfulqa_mc_0",
split="train")
```
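Any configuration listed in this card's `configs` section can be loaded the same way. For example, a minimal sketch that loads the most recent run of a single MMLU subtask via the `latest` split:
```python
from datasets import load_dataset

# "latest" always points to the newest timestamped split of this config.
data = load_dataset(
    "open-llm-leaderboard/details_lyogavin__Anima-7B-100K",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```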
## Latest results
These are the [latest results from run 2023-10-03T23:37:02.912169](https://huggingface.co/datasets/open-llm-leaderboard/details_lyogavin__Anima-7B-100K/blob/main/results_2023-10-03T23-37-02.912169.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.33903815614723354,
"acc_stderr": 0.03394335857311012,
"acc_norm": 0.34277895834311756,
"acc_norm_stderr": 0.03393565406853249,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731627,
"mc2": 0.3784184150795517,
"mc2_stderr": 0.01406913882378111
},
"harness|arc:challenge|25": {
"acc": 0.44283276450511944,
"acc_stderr": 0.0145155738733489,
"acc_norm": 0.4658703071672355,
"acc_norm_stderr": 0.014577311315231102
},
"harness|hellaswag|10": {
"acc": 0.525094602668791,
"acc_stderr": 0.004983492928102841,
"acc_norm": 0.7227643895638319,
"acc_norm_stderr": 0.004467189716140493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37358490566037733,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.37358490566037733,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.03456425745086999,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.03456425745086999
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3741935483870968,
"acc_stderr": 0.027528904299845783,
"acc_norm": 0.3741935483870968,
"acc_norm_stderr": 0.027528904299845783
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.0389853160557942,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.0389853160557942
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41968911917098445,
"acc_stderr": 0.03561587327685882,
"acc_norm": 0.41968911917098445,
"acc_norm_stderr": 0.03561587327685882
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.02350757902064534,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.02350757902064534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31092436974789917,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.31092436974789917,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43119266055045874,
"acc_stderr": 0.02123336503031956,
"acc_norm": 0.43119266055045874,
"acc_norm_stderr": 0.02123336503031956
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5189873417721519,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.5189873417721519,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.31297709923664124,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.31297709923664124,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.045604560863872344,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.045604560863872344
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.0368035037128646,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.0368035037128646
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.03265903381186196,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.03265903381186196
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.44061302681992337,
"acc_stderr": 0.017753396973908486,
"acc_norm": 0.44061302681992337,
"acc_norm_stderr": 0.017753396973908486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.33236994219653176,
"acc_stderr": 0.025361168749688204,
"acc_norm": 0.33236994219653176,
"acc_norm_stderr": 0.025361168749688204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3633440514469453,
"acc_stderr": 0.027316847674192714,
"acc_norm": 0.3633440514469453,
"acc_norm_stderr": 0.027316847674192714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.026869490744815254,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.026869490744815254
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29726205997392435,
"acc_stderr": 0.01167334617308605,
"acc_norm": 0.29726205997392435,
"acc_norm_stderr": 0.01167334617308605
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3480392156862745,
"acc_stderr": 0.019270998708223974,
"acc_norm": 0.3480392156862745,
"acc_norm_stderr": 0.019270998708223974
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2979591836734694,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.2979591836734694,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3781094527363184,
"acc_stderr": 0.03428867848778658,
"acc_norm": 0.3781094527363184,
"acc_norm_stderr": 0.03428867848778658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03615507630310936,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03615507630310936
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731627,
"mc2": 0.3784184150795517,
"mc2_stderr": 0.01406913882378111
}
}
```
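To work with these aggregated numbers programmatically instead of copying them from the card, you can load the `results` configuration; a minimal sketch, assuming the parquet layout shown in the `configs` section above:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# here "latest" resolves to results_2023-10-03T23-37-02.912169.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_lyogavin__Anima-7B-100K",
    "results",
    split="latest",
)
```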
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-codellama2-34b-v11.1-bf16 | 2023-10-03T23:41:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16](https://huggingface.co/OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-codellama2-34b-v11.1-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T23:40:22.620996](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-codellama2-34b-v11.1-bf16/blob/main/results_2023-10-03T23-40-22.620996.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5550459864088625,\n\
\ \"acc_stderr\": 0.034737804810213574,\n \"acc_norm\": 0.5587640432720675,\n\
\ \"acc_norm_stderr\": 0.03473060679811294,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5300576050535195,\n\
\ \"mc2_stderr\": 0.015528670586705939\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4641638225255973,\n \"acc_stderr\": 0.014573813664735716,\n\
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.014611390804670088\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5283808006373233,\n\
\ \"acc_stderr\": 0.004981736689518747,\n \"acc_norm\": 0.7119099780920135,\n\
\ \"acc_norm_stderr\": 0.0045194768356467754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155236,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155236\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6258064516129033,\n\
\ \"acc_stderr\": 0.027528904299845704,\n \"acc_norm\": 0.6258064516129033,\n\
\ \"acc_norm_stderr\": 0.027528904299845704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.03459058815883231,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.03459058815883231\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.018776052319619627,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.018776052319619627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.032133257173736156,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.032133257173736156\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n\
\ \"acc_stderr\": 0.01622501794477097,\n \"acc_norm\": 0.7100893997445722,\n\
\ \"acc_norm_stderr\": 0.01622501794477097\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2860335195530726,\n\
\ \"acc_stderr\": 0.01511397212906214,\n \"acc_norm\": 0.2860335195530726,\n\
\ \"acc_norm_stderr\": 0.01511397212906214\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.02827549015679146,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.02827549015679146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581975,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581975\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606676,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606676\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543472,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.012510181636960672,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.012510181636960672\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5081699346405228,\n \"acc_stderr\": 0.02022513434305727,\n \
\ \"acc_norm\": 0.5081699346405228,\n \"acc_norm_stderr\": 0.02022513434305727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5300576050535195,\n\
\ \"mc2_stderr\": 0.015528670586705939\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-40-22.620996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-40-22.620996.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-40-22.620996.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-40-22.620996.parquet'
- config_name: results
data_files:
- split: 2023_10_03T23_40_22.620996
path:
- results_2023-10-03T23-40-22.620996.parquet
- split: latest
path:
- results_2023-10-03T23-40-22.620996.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16](https://huggingface.co/OpenBuddy/openbuddy-codellama2-34b-v11.1-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-codellama2-34b-v11.1-bf16",
"harness_truthfulqa_mc_0",
split="train")
```
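Any of the `config_name`/`split` pairs declared in the YAML header above can be loaded the same way. As a minimal sketch (assuming only that the `datasets` library is installed), the aggregated `results` configuration can be loaded at its `latest` split:
```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration declared in the header;
# its "latest" split always points at the most recent run's parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-codellama2-34b-v11.1-bf16",
    "results",
    split="latest",
)
```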
## Latest results
These are the [latest results from run 2023-10-03T23:40:22.620996](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-codellama2-34b-v11.1-bf16/blob/main/results_2023-10-03T23-40-22.620996.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5550459864088625,
"acc_stderr": 0.034737804810213574,
"acc_norm": 0.5587640432720675,
"acc_norm_stderr": 0.03473060679811294,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5300576050535195,
"mc2_stderr": 0.015528670586705939
},
"harness|arc:challenge|25": {
"acc": 0.4641638225255973,
"acc_stderr": 0.014573813664735716,
"acc_norm": 0.5,
"acc_norm_stderr": 0.014611390804670088
},
"harness|hellaswag|10": {
"acc": 0.5283808006373233,
"acc_stderr": 0.004981736689518747,
"acc_norm": 0.7119099780920135,
"acc_norm_stderr": 0.0045194768356467754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155236,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155236
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.03459058815883231,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.03459058815883231
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.018776052319619627,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.018776052319619627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.032133257173736156,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.032133257173736156
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.01622501794477097,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.01622501794477097
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2860335195530726,
"acc_stderr": 0.01511397212906214,
"acc_norm": 0.2860335195530726,
"acc_norm_stderr": 0.01511397212906214
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.02827549015679146,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.02827549015679146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581975,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581975
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606676,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543472,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.012510181636960672,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.012510181636960672
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5081699346405228,
"acc_stderr": 0.02022513434305727,
"acc_norm": 0.5081699346405228,
"acc_norm_stderr": 0.02022513434305727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5300576050535195,
"mc2_stderr": 0.015528670586705939
}
}
```
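To work with these numbers programmatically rather than copying them out of the card, one option is to download the raw results file linked above. A hedged sketch, assuming only that the `huggingface_hub` package is available (the JSON's exact top-level nesting is not guaranteed to match the excerpt shown here):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results JSON referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenBuddy__openbuddy-codellama2-34b-v11.1-bf16",
    filename="results_2023-10-03T23-40-22.620996.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(sorted(results.keys()))  # inspect the structure before drilling in
```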
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ccore/bible_formated | 2023-10-03T23:43:37.000Z | [
"license:mit",
"region:us"
] | ccore | null | null | null | 0 | 0 | ---
license: mit
---
|
open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft | 2023-10-03T23:45:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/Llama-2-7b-longlora-100k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-7b-longlora-100k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T23:44:33.008703](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft/blob/main/results_2023-10-03T23-44-33.008703.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the \"results\" and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23455922490809464,\n\
\ \"acc_stderr\": 0.030842949259564694,\n \"acc_norm\": 0.2358908940023703,\n\
\ \"acc_norm_stderr\": 0.03086717191290796,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104185,\n \"mc2\": 0.4905705645797878,\n\
\ \"mc2_stderr\": 0.016866766009940547\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20051194539249148,\n \"acc_stderr\": 0.011700318050499377,\n\
\ \"acc_norm\": 0.2815699658703072,\n \"acc_norm_stderr\": 0.013143376735009024\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2568213503286198,\n\
\ \"acc_stderr\": 0.004359871519639541,\n \"acc_norm\": 0.25433180641306513,\n\
\ \"acc_norm_stderr\": 0.004345949382382378\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.036845294917747094,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.036845294917747094\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198813,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198813\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.17096774193548386,\n \"acc_stderr\": 0.02141724293632157,\n \"\
acc_norm\": 0.17096774193548386,\n \"acc_norm_stderr\": 0.02141724293632157\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n \"\
acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180361,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180361\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423077,\n\
\ \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423077\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.024762902678057922,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.024762902678057922\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.0284588209914603,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.0284588209914603\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802277,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802277\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2006172839506173,\n \"acc_stderr\": 0.022282313949774885,\n\
\ \"acc_norm\": 0.2006172839506173,\n \"acc_norm_stderr\": 0.022282313949774885\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25684485006518903,\n\
\ \"acc_stderr\": 0.011158455853098853,\n \"acc_norm\": 0.25684485006518903,\n\
\ \"acc_norm_stderr\": 0.011158455853098853\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.02257177102549474,\n\
\ \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.02257177102549474\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146637,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.01489627744104185,\n \"mc2\": 0.4905705645797878,\n\
\ \"mc2_stderr\": 0.016866766009940547\n }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T23-44-33.008703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-44-33.008703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T23-44-33.008703.parquet'
- config_name: results
data_files:
- split: 2023_10_03T23_44_33.008703
path:
- results_2023-10-03T23-44-33.008703.parquet
- split: latest
path:
- results_2023-10-03T23-44-33.008703.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-7b-longlora-100k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-7b-longlora-100k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Load the per-sample details for the 0-shot TruthfulQA MC task of this run
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft",
	"harness_truthfulqa_mc_0",
	split="train")
```
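Each per-task configuration also exposes a `latest` split (see the YAML header above), which tracks the newest evaluation run. A minimal sketch, assuming the same configuration name as above:
```python
from datasets import load_dataset

# "latest" always resolves to the most recent run for this configuration,
# so with a single run it matches the "train" split loaded above.
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft",
                    "harness_truthfulqa_mc_0",
                    split="latest")
```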
## Latest results
These are the [latest results from run 2023-10-03T23:44:33.008703](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft/blob/main/results_2023-10-03T23-44-33.008703.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23455922490809464,
"acc_stderr": 0.030842949259564694,
"acc_norm": 0.2358908940023703,
"acc_norm_stderr": 0.03086717191290796,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104185,
"mc2": 0.4905705645797878,
"mc2_stderr": 0.016866766009940547
},
"harness|arc:challenge|25": {
"acc": 0.20051194539249148,
"acc_stderr": 0.011700318050499377,
"acc_norm": 0.2815699658703072,
"acc_norm_stderr": 0.013143376735009024
},
"harness|hellaswag|10": {
"acc": 0.2568213503286198,
"acc_stderr": 0.004359871519639541,
"acc_norm": 0.25433180641306513,
"acc_norm_stderr": 0.004345949382382378
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.036845294917747094,
"acc_norm": 0.16,
"acc_norm_stderr": 0.036845294917747094
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198813,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198813
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.17096774193548386,
"acc_stderr": 0.02141724293632157,
"acc_norm": 0.17096774193548386,
"acc_norm_stderr": 0.02141724293632157
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180361,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180361
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423077,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423077
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.024762902678057922,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.024762902678057922
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802277,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802277
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891148,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891148
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2006172839506173,
"acc_stderr": 0.022282313949774885,
"acc_norm": 0.2006172839506173,
"acc_norm_stderr": 0.022282313949774885
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25684485006518903,
"acc_stderr": 0.011158455853098853,
"acc_norm": 0.25684485006518903,
"acc_norm_stderr": 0.011158455853098853
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.02257177102549474,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.02257177102549474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146637,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104185,
"mc2": 0.4905705645797878,
"mc2_stderr": 0.016866766009940547
}
}
```
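The aggregated metrics above are also stored in the `results` configuration declared in the YAML header. A minimal sketch for loading them:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics shown above;
# its "latest" split points at the newest results parquet.
results = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-100k-ft",
                       "results",
                       split="latest")
```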
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-9dc6b5e6-3ef6-49ed-9316-24ea367f9cf2 | 2023-10-04T00:01:48.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
zkdeng/black_widows_full | 2023-10-03T23:55:39.000Z | [
"license:apache-2.0",
"region:us"
] | zkdeng | null | null | null | 0 | 0 | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Lactrodectus_hesperus
'1': Parasteatoda_tepidariorum
splits:
- name: train
num_bytes: 305921410.732
num_examples: 14894
download_size: 302846432
dataset_size: 305921410.732
---
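A minimal sketch for loading this image-classification dataset and resolving the integer labels declared above to class names (split and feature names follow the YAML header; the repository id is assumed from the entry id):
```python
from datasets import load_dataset

# Load the single "train" split declared in the YAML header.
ds = load_dataset("zkdeng/black_widows_full", split="train")

# "label" is a ClassLabel feature; map the integer back to its class name.
label_names = ds.features["label"].names
example = ds[0]
print(label_names[example["label"]])  # e.g. 'Lactrodectus_hesperus'
```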
|
teragron/llama2c_pretokenizer | 2023-10-03T23:54:52.000Z | [
"region:us"
] | teragron | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-f3210cf0-a8c7-4386-a336-efd4f384d2c4 | 2023-10-04T00:14:49.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k | 2023-10-04T00:05:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of totally-not-an-llm/EverythingLM-13b-V3-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [totally-not-an-llm/EverythingLM-13b-V3-16k](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-16k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:03:41.509774](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k/blob/main/results_2023-10-04T00-03-41.509774.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5069978955792468,\n\
\ \"acc_stderr\": 0.035156221910441904,\n \"acc_norm\": 0.5111282054767287,\n\
\ \"acc_norm_stderr\": 0.03513827774864239,\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.451848676236298,\n\
\ \"mc2_stderr\": 0.01535667204270541\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5366894197952219,\n \"acc_stderr\": 0.014572000527756993,\n\
\ \"acc_norm\": 0.5819112627986348,\n \"acc_norm_stderr\": 0.01441398839699608\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6027683728340968,\n\
\ \"acc_stderr\": 0.004883246579496666,\n \"acc_norm\": 0.8012348137821151,\n\
\ \"acc_norm_stderr\": 0.003982553164086264\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.04068590050224971,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.04068590050224971\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.031565646822367836,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.031565646822367836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.027709359675032495,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.027709359675032495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877793,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877793\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954935,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954935\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.032473902765696686,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.032473902765696686\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.691743119266055,\n \"acc_stderr\": 0.01979836669836725,\n \"acc_norm\"\
: 0.691743119266055,\n \"acc_norm_stderr\": 0.01979836669836725\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591519,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591519\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041694,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041694\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.029343114798094455,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.029343114798094455\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6743295019157088,\n\
\ \"acc_stderr\": 0.016757989458549682,\n \"acc_norm\": 0.6743295019157088,\n\
\ \"acc_norm_stderr\": 0.016757989458549682\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.02675625512966377,\n\
\ \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.02675625512966377\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02852638345214264,\n\
\ \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02852638345214264\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930477,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930477\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662737,\n\
\ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115886,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3617992177314211,\n\
\ \"acc_stderr\": 0.012272736233262943,\n \"acc_norm\": 0.3617992177314211,\n\
\ \"acc_norm_stderr\": 0.012272736233262943\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.02997280717046463,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.02997280717046463\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4624183006535948,\n \"acc_stderr\": 0.020170614974969775,\n \
\ \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.020170614974969775\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348643,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.451848676236298,\n\
\ \"mc2_stderr\": 0.01535667204270541\n }\n}\n```"
repo_url: https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-03-41.509774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-03-41.509774.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-03-41.509774.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-03-41.509774.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_03_41.509774
path:
- results_2023-10-04T00-03-41.509774.parquet
- split: latest
path:
- results_2023-10-04T00-03-41.509774.parquet
---
# Dataset Card for Evaluation run of totally-not-an-llm/EverythingLM-13b-V3-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [totally-not-an-llm/EverythingLM-13b-V3-16k](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k",
"harness_truthfulqa_mc_0",
split="train")
```
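The per-task configurations above also expose a "latest" split alongside the timestamped one, and the aggregated metrics live in the "results" configuration. As a minimal sketch (assuming the configuration and split names declared in the YAML above), you can load the aggregated results like this:
```python
from datasets import load_dataset

# Aggregated metrics for this run; the "latest" split resolves to the
# most recent results parquet file listed in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k",
    "results",
    split="latest",
)
```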
## Latest results
These are the [latest results from run 2023-10-04T00:03:41.509774](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k/blob/main/results_2023-10-04T00-03-41.509774.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5069978955792468,
"acc_stderr": 0.035156221910441904,
"acc_norm": 0.5111282054767287,
"acc_norm_stderr": 0.03513827774864239,
"mc1": 0.31946144430844553,
"mc1_stderr": 0.0163226441829605,
"mc2": 0.451848676236298,
"mc2_stderr": 0.01535667204270541
},
"harness|arc:challenge|25": {
"acc": 0.5366894197952219,
"acc_stderr": 0.014572000527756993,
"acc_norm": 0.5819112627986348,
"acc_norm_stderr": 0.01441398839699608
},
"harness|hellaswag|10": {
"acc": 0.6027683728340968,
"acc_stderr": 0.004883246579496666,
"acc_norm": 0.8012348137821151,
"acc_norm_stderr": 0.003982553164086264
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.04068590050224971,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.04068590050224971
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.031565646822367836,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.031565646822367836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.027709359675032495,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.027709359675032495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.037818873532059816,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.037818873532059816
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877793,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877793
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.025334667080954935,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.025334667080954935
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.032473902765696686,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.032473902765696686
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.691743119266055,
"acc_stderr": 0.01979836669836725,
"acc_norm": 0.691743119266055,
"acc_norm_stderr": 0.01979836669836725
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591519,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591519
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285712,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285712
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041694,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041694
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.029343114798094455,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.029343114798094455
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6743295019157088,
"acc_stderr": 0.016757989458549682,
"acc_norm": 0.6743295019157088,
"acc_norm_stderr": 0.016757989458549682
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.02675625512966377,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.02675625512966377
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02852638345214264,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02852638345214264
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930477,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930477
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662737,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115886,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3617992177314211,
"acc_stderr": 0.012272736233262943,
"acc_norm": 0.3617992177314211,
"acc_norm_stderr": 0.012272736233262943
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.02997280717046463,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.02997280717046463
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4624183006535948,
"acc_stderr": 0.020170614974969775,
"acc_norm": 0.4624183006535948,
"acc_norm_stderr": 0.020170614974969775
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31946144430844553,
"mc1_stderr": 0.0163226441829605,
"mc2": 0.451848676236298,
"mc2_stderr": 0.01535667204270541
}
}
```
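If you prefer the raw JSON linked above to the parquet-backed configurations, one option is to download it directly from the dataset repository; this is a sketch using `huggingface_hub`, with the filename taken from the "Latest results" link:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k",
    filename="results_2023-10-04T00-03-41.509774.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # 0.5069978955792468 per the snippet above
```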
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atomi-labs/sml_gold_schema_test | 2023-10-04T00:07:08.000Z | [
"region:us"
] | atomi-labs | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: subject_group_code
dtype: string
- name: question_type
dtype: string
- name: reference_answer_type
dtype: string
- name: question
dtype: string
- name: reference_answer
dtype: string
- name: student_answer
dtype: string
- name: label
dtype: string
- name: test_type
dtype: string
- name: text
dtype: string
- name: question_unique_id
dtype: string
- name: question_attempt_id
dtype: string
- name: confidence_score
dtype: float64
- name: labelling_postprocessing_run_timestamp
dtype: string
- name: post_id
dtype: int64
- name: module_id
dtype: int64
- name: topic_id
dtype: int64
- name: subtopic_id
dtype: int64
- name: question_attempt_timestamp
dtype: 'null'
- name: html_url
dtype: 'null'
- name: annotation_type_category
dtype: string
- name: annotation_type
dtype: string
- name: labelling_function
dtype: string
- name: dataset_preparation_run_id
dtype: string
- name: labelling_postprocessing_run_id
dtype: string
splits:
- name: train
num_bytes: 1804613
num_examples: 1364
- name: test
num_bytes: 1117505
num_examples: 893
download_size: 637008
dataset_size: 2922118
---
# Dataset Card for "sml_gold_schema_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atomi-labs/sml_silver_schema_test | 2023-10-04T00:07:22.000Z | [
"region:us"
] | atomi-labs | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: subject_group_code
dtype: string
- name: question_type
dtype: string
- name: reference_answer_type
dtype: string
- name: question
dtype: string
- name: reference_answer
dtype: string
- name: student_answer
dtype: string
- name: label
dtype: string
- name: test_type
dtype: string
- name: text
dtype: string
- name: question_unique_id
dtype: string
- name: question_attempt_id
dtype: string
- name: confidence_score
dtype: string
- name: labelling_postprocessing_run_timestamp
dtype: string
- name: annotation_type_category
dtype: string
- name: annotation_type
dtype: string
- name: labelling_function
dtype: string
- name: post_id
dtype: int64
- name: module_id
dtype: int64
- name: topic_id
dtype: int64
- name: subtopic_id
dtype: int64
- name: question_attempt_timestamp
dtype: 'null'
- name: html_url
dtype: 'null'
- name: dataset_preparation_run_id
dtype: string
- name: labelling_postprocessing_run_id
dtype: string
splits:
- name: train
num_bytes: 28098257
num_examples: 21714
- name: test
num_bytes: 1112147
num_examples: 893
download_size: 4343033
dataset_size: 29210404
---
# Dataset Card for "sml_silver_schema_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_L-R__LLmRa-1.3B | 2023-10-04T00:14:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of L-R/LLmRa-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [L-R/LLmRa-1.3B](https://huggingface.co/L-R/LLmRa-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_L-R__LLmRa-1.3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:12:55.866010](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-1.3B/blob/main/results_2023-10-04T00-12-55.866010.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23719010007900507,\n\
\ \"acc_stderr\": 0.030767351862012883,\n \"acc_norm\": 0.23996136122760237,\n\
\ \"acc_norm_stderr\": 0.03077121539116864,\n \"mc1\": 0.22031823745410037,\n\
\ \"mc1_stderr\": 0.01450904517148729,\n \"mc2\": 0.3621367131719433,\n\
\ \"mc2_stderr\": 0.013796494197668843\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.302901023890785,\n \"acc_stderr\": 0.013428241573185349,\n\
\ \"acc_norm\": 0.3267918088737201,\n \"acc_norm_stderr\": 0.013706665975587338\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44811790479984065,\n\
\ \"acc_stderr\": 0.004962846206125494,\n \"acc_norm\": 0.5877315275841466,\n\
\ \"acc_norm_stderr\": 0.004912370023913024\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066652,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066652\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2037735849056604,\n \"acc_stderr\": 0.0247907845017754,\n\
\ \"acc_norm\": 0.2037735849056604,\n \"acc_norm_stderr\": 0.0247907845017754\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.15,\n\
\ \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.02185150982203172,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.02185150982203172\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243183,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243183\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.22903225806451613,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.02805779167298901,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.02805779167298901\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752947,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230186,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230186\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.025859164122051467,\n\
\ \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.025859164122051467\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.017149858514250937,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.017149858514250937\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n\
\ \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n\
\ \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.2600896860986547,\n\
\ \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n\
\ \"acc_stderr\": 0.015302380123542075,\n \"acc_norm\": 0.2413793103448276,\n\
\ \"acc_norm_stderr\": 0.015302380123542075\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.01414957534897627,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.01414957534897627\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19614147909967847,\n\
\ \"acc_stderr\": 0.022552447780478022,\n \"acc_norm\": 0.19614147909967847,\n\
\ \"acc_norm_stderr\": 0.022552447780478022\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262203,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262203\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307713,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307713\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884601,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884601\n },\n\
\ \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1673469387755102,\n\
\ \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.1673469387755102,\n\
\ \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.02947525023601718,\n\
\ \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.02947525023601718\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21637426900584794,\n\
\ \"acc_stderr\": 0.031581495393387345,\n \"acc_norm\": 0.21637426900584794,\n\
\ \"acc_norm_stderr\": 0.031581495393387345\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.01450904517148729,\n\
\ \"mc2\": 0.3621367131719433,\n \"mc2_stderr\": 0.013796494197668843\n\
\ }\n}\n```"
repo_url: https://huggingface.co/L-R/LLmRa-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-12-55.866010.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- results_2023-10-04T00-12-55.866010.parquet
- split: latest
path:
- results_2023-10-04T00-12-55.866010.parquet
---
# Dataset Card for Evaluation run of L-R/LLmRa-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/L-R/LLmRa-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [L-R/LLmRa-1.3B](https://huggingface.co/L-R/LLmRa-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_L-R__LLmRa-1.3B",
"harness_truthfulqa_mc_0",
split="train")
```
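As a minimal sketch (the config and split names are taken from the `configs` list above), the aggregated "results" configuration and a specific timestamped run can be loaded the same way:
```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split always points
# at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_L-R__LLmRa-1.3B",
                       "results",
                       split="latest")

# A single run can also be addressed by its timestamped split name.
run = load_dataset("open-llm-leaderboard/details_L-R__LLmRa-1.3B",
                   "harness_arc_challenge_25",
                   split="2023_10_04T00_12_55.866010")
```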
## Latest results
These are the [latest results from run 2023-10-04T00:12:55.866010](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-1.3B/blob/main/results_2023-10-04T00-12-55.866010.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23719010007900507,
"acc_stderr": 0.030767351862012883,
"acc_norm": 0.23996136122760237,
"acc_norm_stderr": 0.03077121539116864,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.01450904517148729,
"mc2": 0.3621367131719433,
"mc2_stderr": 0.013796494197668843
},
"harness|arc:challenge|25": {
"acc": 0.302901023890785,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.3267918088737201,
"acc_norm_stderr": 0.013706665975587338
},
"harness|hellaswag|10": {
"acc": 0.44811790479984065,
"acc_stderr": 0.004962846206125494,
"acc_norm": 0.5877315275841466,
"acc_norm_stderr": 0.004912370023913024
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066652,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2037735849056604,
"acc_stderr": 0.0247907845017754,
"acc_norm": 0.2037735849056604,
"acc_norm_stderr": 0.0247907845017754
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173043,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173043
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.02185150982203172,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.02185150982203172
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243183,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243183
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.2,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752947,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230186,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230186
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19747899159663865,
"acc_stderr": 0.025859164122051467,
"acc_norm": 0.19747899159663865,
"acc_norm_stderr": 0.025859164122051467
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2,
"acc_stderr": 0.017149858514250937,
"acc_norm": 0.2,
"acc_norm_stderr": 0.017149858514250937
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.015302380123542075,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.015302380123542075
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.01414957534897627,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.01414957534897627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19614147909967847,
"acc_stderr": 0.022552447780478022,
"acc_norm": 0.19614147909967847,
"acc_norm_stderr": 0.022552447780478022
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262203,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307713,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307713
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.02947525023601718,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.02947525023601718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.031581495393387345,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.031581495393387345
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.01450904517148729,
"mc2": 0.3621367131719433,
"mc2_stderr": 0.013796494197668843
}
}
```
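To compare the per-task scores above side by side, the JSON can be flattened into a table. A minimal sketch, assuming `pandas` is installed; the two rows below are excerpted from the results printed above:
```python
import json

import pandas as pd

# Excerpt of the JSON printed above (in practice, read the linked
# results_2023-10-04T00-12-55.866010.json file instead).
raw = """
{
  "harness|arc:challenge|25": {"acc": 0.302901023890785, "acc_norm": 0.3267918088737201},
  "harness|hellaswag|10": {"acc": 0.44811790479984065, "acc_norm": 0.5877315275841466}
}
"""

# One row per task, one column per metric.
df = pd.DataFrame.from_dict(json.loads(raw), orient="index")
print(df.sort_values("acc_norm", ascending=False))
```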
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-d4382040-b991-4721-a2df-eb2fd8fed1f2 | 2023-10-04T00:28:50.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_dotvignesh__perry-7b | 2023-10-04T00:16:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dotvignesh/perry-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dotvignesh/perry-7b](https://huggingface.co/dotvignesh/perry-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dotvignesh__perry-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:15:19.939384](https://huggingface.co/datasets/open-llm-leaderboard/details_dotvignesh__perry-7b/blob/main/results_2023-10-04T00-15-19.939384.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46376192978198955,\n\
\ \"acc_stderr\": 0.03509876929191512,\n \"acc_norm\": 0.4678690709500716,\n\
\ \"acc_norm_stderr\": 0.03508714508699615,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4008031328552837,\n\
\ \"mc2_stderr\": 0.014310534656953405\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46928327645051193,\n \"acc_stderr\": 0.014583792546304038,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490975\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5706034654451304,\n\
\ \"acc_stderr\": 0.004939784311448985,\n \"acc_norm\": 0.7642899820752838,\n\
\ \"acc_norm_stderr\": 0.004235743182042551\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n \
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.030772653642075657,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.030772653642075657\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.5096774193548387,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\
acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4256410256410256,\n \"acc_stderr\": 0.025069094387296532,\n\
\ \"acc_norm\": 0.4256410256410256,\n \"acc_norm_stderr\": 0.025069094387296532\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844065,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844065\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.0316314580755238,\n \
\ \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.0316314580755238\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6073394495412844,\n\
\ \"acc_stderr\": 0.020937505161201093,\n \"acc_norm\": 0.6073394495412844,\n\
\ \"acc_norm_stderr\": 0.020937505161201093\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.03128039084329883,\n\
\ \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.03128039084329883\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5882352941176471,\n \"acc_stderr\": 0.03454236585380608,\n \"\
acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03454236585380608\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6118143459915611,\n \"acc_stderr\": 0.031722950043323275,\n \
\ \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.031722950043323275\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.039015918258361836,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.039015918258361836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6500638569604087,\n\
\ \"acc_stderr\": 0.017055679797150426,\n \"acc_norm\": 0.6500638569604087,\n\
\ \"acc_norm_stderr\": 0.017055679797150426\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925296,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925296\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n\
\ \"acc_stderr\": 0.028290869054197608,\n \"acc_norm\": 0.5434083601286174,\n\
\ \"acc_norm_stderr\": 0.028290869054197608\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3226857887874837,\n\
\ \"acc_stderr\": 0.01194026419319599,\n \"acc_norm\": 0.3226857887874837,\n\
\ \"acc_norm_stderr\": 0.01194026419319599\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.03000856284500348,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.03000856284500348\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4199346405228758,\n \"acc_stderr\": 0.019966811178256487,\n \
\ \"acc_norm\": 0.4199346405228758,\n \"acc_norm_stderr\": 0.019966811178256487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310936,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310936\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4008031328552837,\n\
\ \"mc2_stderr\": 0.014310534656953405\n }\n}\n```"
repo_url: https://huggingface.co/dotvignesh/perry-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-15-19.939384.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-15-19.939384.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-15-19.939384.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_15_19.939384
path:
- results_2023-10-04T00-15-19.939384.parquet
- split: latest
path:
- results_2023-10-04T00-15-19.939384.parquet
---
# Dataset Card for Evaluation run of dotvignesh/perry-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dotvignesh/perry-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dotvignesh/perry-7b](https://huggingface.co/dotvignesh/perry-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dotvignesh__perry-7b",
"harness_truthfulqa_mc_0",
split="train")
```
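If you want to discover the available configurations first, the `datasets` library can list them for you. A minimal sketch (assuming network access to the Hub; `harness_arc_challenge_25` and the `latest` split are the names declared in this card's metadata):

```python
from datasets import get_dataset_config_names, load_dataset

# List the configurations declared for this repo (one per evaluated task, plus "results")
configs = get_dataset_config_names("open-llm-leaderboard/details_dotvignesh__perry-7b")
print(configs)  # e.g. ['harness_arc_challenge_25', 'harness_hellaswag_10', ...]

# The "latest" split resolves to the most recent evaluation run
data = load_dataset("open-llm-leaderboard/details_dotvignesh__perry-7b",
	"harness_arc_challenge_25",
	split="latest")
```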
## Latest results
These are the [latest results from run 2023-10-04T00:15:19.939384](https://huggingface.co/datasets/open-llm-leaderboard/details_dotvignesh__perry-7b/blob/main/results_2023-10-04T00-15-19.939384.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.46376192978198955,
"acc_stderr": 0.03509876929191512,
"acc_norm": 0.4678690709500716,
"acc_norm_stderr": 0.03508714508699615,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4008031328552837,
"mc2_stderr": 0.014310534656953405
},
"harness|arc:challenge|25": {
"acc": 0.46928327645051193,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490975
},
"harness|hellaswag|10": {
"acc": 0.5706034654451304,
"acc_stderr": 0.004939784311448985,
"acc_norm": 0.7642899820752838,
"acc_norm_stderr": 0.004235743182042551
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.030772653642075657,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.030772653642075657
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4256410256410256,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.4256410256410256,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844065,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3865546218487395,
"acc_stderr": 0.0316314580755238,
"acc_norm": 0.3865546218487395,
"acc_norm_stderr": 0.0316314580755238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6073394495412844,
"acc_stderr": 0.020937505161201093,
"acc_norm": 0.6073394495412844,
"acc_norm_stderr": 0.020937505161201093
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.03128039084329883,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.03128039084329883
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03454236585380608,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03454236585380608
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.031722950043323275,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.031722950043323275
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6500638569604087,
"acc_stderr": 0.017055679797150426,
"acc_norm": 0.6500638569604087,
"acc_norm_stderr": 0.017055679797150426
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48265895953757226,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.48265895953757226,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925296,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925296
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.028290869054197608,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.028290869054197608
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3226857887874837,
"acc_stderr": 0.01194026419319599,
"acc_norm": 0.3226857887874837,
"acc_norm_stderr": 0.01194026419319599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.03000856284500348,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.03000856284500348
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4199346405228758,
"acc_stderr": 0.019966811178256487,
"acc_norm": 0.4199346405228758,
"acc_norm_stderr": 0.019966811178256487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310936,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310936
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4008031328552837,
"mc2_stderr": 0.014310534656953405
}
}
```
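These aggregated numbers can also be retrieved programmatically from the `results` configuration rather than copied from this card. A minimal sketch, assuming the `latest` split declared in this card's metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics for each run
results = load_dataset("open-llm-leaderboard/details_dotvignesh__perry-7b",
	"results",
	split="latest")
print(results[0])  # should contain the acc / acc_norm / mc1 / mc2 aggregates shown above
```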
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat-bilingual | 2023-10-04T00:22:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LeoLM/leo-hessianai-7b-chat-bilingual
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeoLM/leo-hessianai-7b-chat-bilingual](https://huggingface.co/LeoLM/leo-hessianai-7b-chat-bilingual)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat-bilingual\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:21:31.959346](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat-bilingual/blob/main/results_2023-10-04T00-21-31.959346.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4495842051388625,\n\
\ \"acc_stderr\": 0.035309180456332794,\n \"acc_norm\": 0.45322784210770745,\n\
\ \"acc_norm_stderr\": 0.035297608659625966,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.47163146796973165,\n\
\ \"mc2_stderr\": 0.015355149347799014\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4880546075085324,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5102389078498294,\n \"acc_norm_stderr\": 0.014608326906285012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5675164309898426,\n\
\ \"acc_stderr\": 0.00494408060504877,\n \"acc_norm\": 0.7603067118103963,\n\
\ \"acc_norm_stderr\": 0.004260238033657902\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353229,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353229\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.031639106653672915,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.031639106653672915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373056,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373056\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5483870967741935,\n\
\ \"acc_stderr\": 0.02831050034856839,\n \"acc_norm\": 0.5483870967741935,\n\
\ \"acc_norm_stderr\": 0.02831050034856839\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103872,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5606060606060606,\n\
\ \"acc_stderr\": 0.0353608594752948,\n \"acc_norm\": 0.5606060606060606,\n\
\ \"acc_norm_stderr\": 0.0353608594752948\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6217616580310881,\n \"acc_stderr\": 0.03499807276193338,\n\
\ \"acc_norm\": 0.6217616580310881,\n \"acc_norm_stderr\": 0.03499807276193338\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.02466674491518722,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.02466674491518722\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.618348623853211,\n \"acc_stderr\": 0.02082814851702259,\n \"acc_norm\"\
: 0.618348623853211,\n \"acc_norm_stderr\": 0.02082814851702259\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n\
\ \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.39814814814814814,\n\
\ \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015477,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015477\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.620253164556962,\n \"acc_stderr\": 0.031591887529658504,\n \
\ \"acc_norm\": 0.620253164556962,\n \"acc_norm_stderr\": 0.031591887529658504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.048257293373563895,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.048257293373563895\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5854700854700855,\n\
\ \"acc_stderr\": 0.03227396567623779,\n \"acc_norm\": 0.5854700854700855,\n\
\ \"acc_norm_stderr\": 0.03227396567623779\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.6104725415070242,\n \"acc_stderr\": 0.017438082556264597,\n\
\ \"acc_norm\": 0.6104725415070242,\n \"acc_norm_stderr\": 0.017438082556264597\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.026636539741116072,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.026636539741116072\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n\
\ \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49673202614379086,\n\
\ \"acc_stderr\": 0.02862930519400354,\n \"acc_norm\": 0.49673202614379086,\n\
\ \"acc_norm_stderr\": 0.02862930519400354\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5627009646302251,\n \"acc_stderr\": 0.028173917761762906,\n\
\ \"acc_norm\": 0.5627009646302251,\n \"acc_norm_stderr\": 0.028173917761762906\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.027815973433878014,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.027815973433878014\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.02826765748265015,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.02826765748265015\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3898305084745763,\n\
\ \"acc_stderr\": 0.012456386619082606,\n \"acc_norm\": 0.3898305084745763,\n\
\ \"acc_norm_stderr\": 0.012456386619082606\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4133986928104575,\n \"acc_stderr\": 0.01992211568278667,\n \
\ \"acc_norm\": 0.4133986928104575,\n \"acc_norm_stderr\": 0.01992211568278667\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421397,\n\
\ \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421397\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.034375193373382504,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.034375193373382504\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6023391812865497,\n \"acc_stderr\": 0.03753638955761691,\n\
\ \"acc_norm\": 0.6023391812865497,\n \"acc_norm_stderr\": 0.03753638955761691\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.47163146796973165,\n\
\ \"mc2_stderr\": 0.015355149347799014\n }\n}\n```"
repo_url: https://huggingface.co/LeoLM/leo-hessianai-7b-chat-bilingual
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-21-31.959346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-21-31.959346.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-21-31.959346.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-21-31.959346.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_21_31.959346
path:
- results_2023-10-04T00-21-31.959346.parquet
- split: latest
path:
- results_2023-10-04T00-21-31.959346.parquet
---
# Dataset Card for Evaluation run of LeoLM/leo-hessianai-7b-chat-bilingual
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LeoLM/leo-hessianai-7b-chat-bilingual
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LeoLM/leo-hessianai-7b-chat-bilingual](https://huggingface.co/LeoLM/leo-hessianai-7b-chat-bilingual) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat-bilingual",
"harness_truthfulqa_mc_0",
split="train")
```
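The same pattern works for any of the per-task configurations and for the aggregated `results` configuration; here is a minimal sketch (the config and split names below are taken from the YAML header of this card, and the "latest" split always points at the most recent run):
```python
from datasets import load_dataset

# Load the latest run of a single MMLU subtask; any config name listed
# in the YAML header of this card can be substituted here.
management = load_dataset(
    "open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat-bilingual",
    "harness_hendrycksTest_management_5",
    split="latest",
)

# The aggregated metrics are stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat-bilingual",
    "results",
    split="latest",
)
```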
## Latest results
These are the [latest results from run 2023-10-04T00:21:31.959346](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat-bilingual/blob/main/results_2023-10-04T00-21-31.959346.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4495842051388625,
"acc_stderr": 0.035309180456332794,
"acc_norm": 0.45322784210770745,
"acc_norm_stderr": 0.035297608659625966,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.47163146796973165,
"mc2_stderr": 0.015355149347799014
},
"harness|arc:challenge|25": {
"acc": 0.4880546075085324,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5102389078498294,
"acc_norm_stderr": 0.014608326906285012
},
"harness|hellaswag|10": {
"acc": 0.5675164309898426,
"acc_stderr": 0.00494408060504877,
"acc_norm": 0.7603067118103963,
"acc_norm_stderr": 0.004260238033657902
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353229,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353229
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373056,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5483870967741935,
"acc_stderr": 0.02831050034856839,
"acc_norm": 0.5483870967741935,
"acc_norm_stderr": 0.02831050034856839
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5606060606060606,
"acc_stderr": 0.0353608594752948,
"acc_norm": 0.5606060606060606,
"acc_norm_stderr": 0.0353608594752948
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6217616580310881,
"acc_stderr": 0.03499807276193338,
"acc_norm": 0.6217616580310881,
"acc_norm_stderr": 0.03499807276193338
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.618348623853211,
"acc_stderr": 0.02082814851702259,
"acc_norm": 0.618348623853211,
"acc_norm_stderr": 0.02082814851702259
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015477,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015477
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.620253164556962,
"acc_stderr": 0.031591887529658504,
"acc_norm": 0.620253164556962,
"acc_norm_stderr": 0.031591887529658504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.048257293373563895,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.048257293373563895
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5854700854700855,
"acc_stderr": 0.03227396567623779,
"acc_norm": 0.5854700854700855,
"acc_norm_stderr": 0.03227396567623779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6104725415070242,
"acc_stderr": 0.017438082556264597,
"acc_norm": 0.6104725415070242,
"acc_norm_stderr": 0.017438082556264597
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.026636539741116072,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.026636539741116072
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.028173917761762906,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.028173917761762906
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.02826765748265015,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.02826765748265015
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3898305084745763,
"acc_stderr": 0.012456386619082606,
"acc_norm": 0.3898305084745763,
"acc_norm_stderr": 0.012456386619082606
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4133986928104575,
"acc_stderr": 0.01992211568278667,
"acc_norm": 0.4133986928104575,
"acc_norm_stderr": 0.01992211568278667
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421397,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421397
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.034375193373382504,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.034375193373382504
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6023391812865497,
"acc_stderr": 0.03753638955761691,
"acc_norm": 0.6023391812865497,
"acc_norm_stderr": 0.03753638955761691
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.47163146796973165,
"mc2_stderr": 0.015355149347799014
}
}
```
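As a quick illustrative sketch (not part of the original card): if you paste the dictionary above into a variable named `results`, you can macro-average the normalized accuracy over the 57 MMLU ("hendrycksTest") subtasks like so:
```python
# `results` is assumed to hold the dictionary printed above.
mmlu = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU macro acc_norm: {sum(mmlu) / len(mmlu):.4f}")
```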
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b | 2023-10-04T00:23:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-dolphin-orca-platypus-34b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-dolphin-orca-platypus-34b](https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:22:19.968928](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b/blob/main/results_2023-10-04T00-22-19.968928.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5342362830958945,\n\
\ \"acc_stderr\": 0.034949132938444705,\n \"acc_norm\": 0.538004053070666,\n\
\ \"acc_norm_stderr\": 0.03493872989072948,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.47135907975593017,\n\
\ \"mc2_stderr\": 0.014951001296424498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49573378839590443,\n \"acc_stderr\": 0.014610858923956945,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.547998406691894,\n\
\ \"acc_stderr\": 0.004966736811010487,\n \"acc_norm\": 0.7412865962955587,\n\
\ \"acc_norm_stderr\": 0.004370328224831782\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236395,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236395\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4830188679245283,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.4830188679245283,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520203,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520203\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5806451612903226,\n\
\ \"acc_stderr\": 0.02807158890109185,\n \"acc_norm\": 0.5806451612903226,\n\
\ \"acc_norm_stderr\": 0.02807158890109185\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.033456784227567746,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.033456784227567746\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.03141024780565317,\n\
\ \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 0.03141024780565317\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.02534267129380725,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115006,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\"\
: 0.7211009174311926,\n \"acc_norm_stderr\": 0.0192274688764635\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.033448873829978666,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.033448873829978666\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036427,\n \"\
acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036427\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.016857391247472552,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.016857391247472552\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5578034682080925,\n \"acc_stderr\": 0.0267386036438074,\n\
\ \"acc_norm\": 0.5578034682080925,\n \"acc_norm_stderr\": 0.0267386036438074\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095277,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095277\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.02850980780262659,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.02850980780262659\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484617,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.0272725828498398,\n\
\ \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.0272725828498398\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087371,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087371\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976694,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5310457516339869,\n \"acc_stderr\": 0.020188804456361887,\n \
\ \"acc_norm\": 0.5310457516339869,\n \"acc_norm_stderr\": 0.020188804456361887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6374269005847953,\n \"acc_stderr\": 0.0368713061556206,\n\
\ \"acc_norm\": 0.6374269005847953,\n \"acc_norm_stderr\": 0.0368713061556206\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.47135907975593017,\n\
\ \"mc2_stderr\": 0.014951001296424498\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-22-19.968928.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-22-19.968928.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-22-19.968928.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_22_19.968928
path:
- results_2023-10-04T00-22-19.968928.parquet
- split: latest
path:
- results_2023-10-04T00-22-19.968928.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-dolphin-orca-platypus-34b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-dolphin-orca-platypus-34b](https://huggingface.co/uukuguy/speechless-codellama-dolphin-orca-platypus-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Per-example details for one task (here TruthfulQA mc, 0-shot);
# per the card, the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b",
	"harness_truthfulqa_mc_0",
	split="train")
```
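The aggregated metrics themselves live in the "results" configuration, whose "latest" split resolves to the most recent run (see the `configs` list in the front matter). A minimal sketch along these lines should work; the exact columns of the results parquet are not documented here, so the code simply prints whatever fields are present:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always points to
# the most recent results parquet (see the `results` config above).
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b",
    "results",
    split="latest",
)
print(results)     # features and number of rows
print(results[0])  # whatever aggregated fields the run recorded
```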
## Latest results
These are the [latest results from run 2023-10-04T00:22:19.968928](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b/blob/main/results_2023-10-04T00-22-19.968928.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5342362830958945,
"acc_stderr": 0.034949132938444705,
"acc_norm": 0.538004053070666,
"acc_norm_stderr": 0.03493872989072948,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.47135907975593017,
"mc2_stderr": 0.014951001296424498
},
"harness|arc:challenge|25": {
"acc": 0.49573378839590443,
"acc_stderr": 0.014610858923956945,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937738
},
"harness|hellaswag|10": {
"acc": 0.547998406691894,
"acc_stderr": 0.004966736811010487,
"acc_norm": 0.7412865962955587,
"acc_norm_stderr": 0.004370328224831782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.04060127035236395,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.04060127035236395
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4830188679245283,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.4830188679245283,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520203,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5806451612903226,
"acc_stderr": 0.02807158890109185,
"acc_norm": 0.5806451612903226,
"acc_norm_stderr": 0.02807158890109185
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.033456784227567746,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.033456784227567746
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.03141024780565317,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.03141024780565317
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.0192274688764635,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.0192274688764635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036427,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036427
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.016857391247472552,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.016857391247472552
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5578034682080925,
"acc_stderr": 0.0267386036438074,
"acc_norm": 0.5578034682080925,
"acc_norm_stderr": 0.0267386036438074
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095277,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095277
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.02850980780262659,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.02850980780262659
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484617,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.0272725828498398,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.0272725828498398
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087371,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087371
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.030254372573976694,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.030254372573976694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5310457516339869,
"acc_stderr": 0.020188804456361887,
"acc_norm": 0.5310457516339869,
"acc_norm_stderr": 0.020188804456361887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6374269005847953,
"acc_stderr": 0.0368713061556206,
"acc_norm": 0.6374269005847953,
"acc_norm_stderr": 0.0368713061556206
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.47135907975593017,
"mc2_stderr": 0.014951001296424498
}
}
```
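To rank the per-task scores programmatically, one option is to fetch the raw results JSON linked above and sort by accuracy. This is a minimal sketch, assuming the file's top level matches (or contains, under a "results" key) the dict shown above; adjust the key lookup if the file is nested differently:
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run (repo_type="dataset" since
# the details live in a dataset repo); filename taken from the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_uukuguy__speechless-codellama-dolphin-orca-platypus-34b",
    filename="results_2023-10-04T00-22-19.968928.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Assumption: the top level (or a "results" key) holds the task-keyed dict.
results = raw.get("results", raw)

# Rank the MMLU (hendrycksTest) subtasks by accuracy, best first.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{acc:.3f}  {task}")
```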
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct | 2023-10-04T00:26:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of togethercomputer/Llama-2-7B-32K-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/Llama-2-7B-32K-Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:24:39.163717](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct/blob/main/results_2023-10-04T00-24-39.163717.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45784972757575226,\n\
\ \"acc_stderr\": 0.03516524590098152,\n \"acc_norm\": 0.46189376535094273,\n\
\ \"acc_norm_stderr\": 0.03515157907367711,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4500878213671718,\n\
\ \"mc2_stderr\": 0.014952301436755123\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.5136518771331058,\n \"acc_norm_stderr\": 0.01460594342986095\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5836486755626369,\n\
\ \"acc_stderr\": 0.004919457850104233,\n \"acc_norm\": 0.7847042421828321,\n\
\ \"acc_norm_stderr\": 0.004101873407354698\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490438,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490438\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.030635627957961823,\n\
\ \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.030635627957961823\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362466,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362466\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3931034482758621,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.3931034482758621,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655816,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655816\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924318,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924318\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n\
\ \"acc_stderr\": 0.02842268740431211,\n \"acc_norm\": 0.5193548387096775,\n\
\ \"acc_norm_stderr\": 0.02842268740431211\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187896,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4025641025641026,\n \"acc_stderr\": 0.024864995159767762,\n\
\ \"acc_norm\": 0.4025641025641026,\n \"acc_norm_stderr\": 0.024864995159767762\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6477064220183486,\n \"acc_stderr\": 0.020480568843998986,\n \"\
acc_norm\": 0.6477064220183486,\n \"acc_norm_stderr\": 0.020480568843998986\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.032149521478027486,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.032149521478027486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088299,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088299\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.030874537537553617,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.030874537537553617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.033516951676526276,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.033516951676526276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n\
\ \"acc_stderr\": 0.03078232157768817,\n \"acc_norm\": 0.6709401709401709,\n\
\ \"acc_norm_stderr\": 0.03078232157768817\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6257982120051085,\n\
\ \"acc_stderr\": 0.017304805072252037,\n \"acc_norm\": 0.6257982120051085,\n\
\ \"acc_norm_stderr\": 0.017304805072252037\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.02691864538323901,\n\
\ \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.02691864538323901\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.43790849673202614,\n \"acc_stderr\": 0.028408302020332694,\n\
\ \"acc_norm\": 0.43790849673202614,\n \"acc_norm_stderr\": 0.028408302020332694\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n\
\ \"acc_stderr\": 0.02832032583010592,\n \"acc_norm\": 0.5369774919614148,\n\
\ \"acc_norm_stderr\": 0.02832032583010592\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3644067796610169,\n\
\ \"acc_stderr\": 0.012291694983056475,\n \"acc_norm\": 0.3644067796610169,\n\
\ \"acc_norm_stderr\": 0.012291694983056475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.02997280717046462,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.02997280717046462\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872408,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872408\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421397,\n\
\ \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421397\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.037229657413855394,\n\
\ \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.037229657413855394\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.4500878213671718,\n\
\ \"mc2_stderr\": 0.014952301436755123\n }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-24-39.163717.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-24-39.163717.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-24-39.163717.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_24_39.163717
path:
- results_2023-10-04T00-24-39.163717.parquet
- split: latest
path:
- results_2023-10-04T00-24-39.163717.parquet
---
# Dataset Card for Evaluation run of togethercomputer/Llama-2-7B-32K-Instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/Llama-2-7B-32K-Instruct](https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct",
"harness_truthfulqa_mc_0",
split="train")
```
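For a single task, the corresponding per-task configuration can be loaded the same way; below is a minimal sketch, assuming the config names and the "latest" split declared in this card's metadata:
```python
from datasets import load_dataset

# A sketch: load the most recent run of one MMLU subtask.
# "harness_hendrycksTest_abstract_algebra_5" is one of the config names
# listed in this card's metadata; its "latest" split points to the newest
# parquet files for that task.
details = load_dataset(
    "open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details[0])  # one evaluated example with the model's predictions
```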
## Latest results
These are the [latest results from run 2023-10-04T00:24:39.163717](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct/blob/main/results_2023-10-04T00-24-39.163717.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45784972757575226,
"acc_stderr": 0.03516524590098152,
"acc_norm": 0.46189376535094273,
"acc_norm_stderr": 0.03515157907367711,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.4500878213671718,
"mc2_stderr": 0.014952301436755123
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5136518771331058,
"acc_norm_stderr": 0.01460594342986095
},
"harness|hellaswag|10": {
"acc": 0.5836486755626369,
"acc_stderr": 0.004919457850104233,
"acc_norm": 0.7847042421828321,
"acc_norm_stderr": 0.004101873407354698
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490438,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490438
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.030635627957961823,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.030635627957961823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.036186648199362466,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.036186648199362466
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3931034482758621,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.3931034482758621,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655816,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655816
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924318,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924318
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4025641025641026,
"acc_stderr": 0.024864995159767762,
"acc_norm": 0.4025641025641026,
"acc_norm_stderr": 0.024864995159767762
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6477064220183486,
"acc_stderr": 0.020480568843998986,
"acc_norm": 0.6477064220183486,
"acc_norm_stderr": 0.020480568843998986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.032149521478027486,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.032149521478027486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088299,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088299
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.033516951676526276,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.033516951676526276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.03078232157768817,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.03078232157768817
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6257982120051085,
"acc_stderr": 0.017304805072252037,
"acc_norm": 0.6257982120051085,
"acc_norm_stderr": 0.017304805072252037
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.02691864538323901,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.02691864538323901
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.43790849673202614,
"acc_stderr": 0.028408302020332694,
"acc_norm": 0.43790849673202614,
"acc_norm_stderr": 0.028408302020332694
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.02832032583010592,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.02832032583010592
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3644067796610169,
"acc_stderr": 0.012291694983056475,
"acc_norm": 0.3644067796610169,
"acc_norm_stderr": 0.012291694983056475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.02997280717046462,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.02997280717046462
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872408,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872408
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421397,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421397
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.037229657413855394,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.037229657413855394
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559693,
"mc2": 0.4500878213671718,
"mc2_stderr": 0.014952301436755123
}
}
```
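To work with these aggregated numbers programmatically, the "results" configuration declared in this card's metadata can be loaded in the same way (a sketch, assuming the parquet layout declared above):
```python
from datasets import load_dataset

# A sketch: load the aggregated results of the run. The "latest" split of
# the "results" config points to results_2023-10-04T00-24-39.163717.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct",
    "results",
    split="latest",
)
print(results[0])
```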
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat | 2023-10-04T00:29:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of LeoLM/leo-hessianai-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeoLM/leo-hessianai-7b-chat](https://huggingface.co/LeoLM/leo-hessianai-7b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:28:30.897823](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat/blob/main/results_2023-10-04T00-28-30.897823.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45818760509669304,\n\
\ \"acc_stderr\": 0.03541282966101462,\n \"acc_norm\": 0.46238611930946416,\n\
\ \"acc_norm_stderr\": 0.03539979649202152,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44890171447013644,\n\
\ \"mc2_stderr\": 0.015424897702943842\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47525597269624575,\n \"acc_stderr\": 0.014593487694937736,\n\
\ \"acc_norm\": 0.5255972696245734,\n \"acc_norm_stderr\": 0.014592230885298962\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5787691694881498,\n\
\ \"acc_stderr\": 0.004927473370720143,\n \"acc_norm\": 0.776140211113324,\n\
\ \"acc_norm_stderr\": 0.004159773209765883\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851302,\n\
\ \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851302\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.0314895582974553,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.0314895582974553\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101813,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101813\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5451612903225806,\n \"acc_stderr\": 0.028327743091561063,\n \"\
acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.028327743091561063\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.038154943086889305,\n\
\ \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.038154943086889305\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\"\
: 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6217616580310881,\n \"acc_stderr\": 0.03499807276193338,\n\
\ \"acc_norm\": 0.6217616580310881,\n \"acc_norm_stderr\": 0.03499807276193338\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37948717948717947,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.37948717948717947,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.02100420126042007,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.02100420126042007\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n\
\ \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5931372549019608,\n \"acc_stderr\": 0.034478911363533815,\n \"\
acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.034478911363533815\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5201793721973094,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.5201793721973094,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179662,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179662\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n\
\ \"acc_stderr\": 0.0312561082442188,\n \"acc_norm\": 0.6495726495726496,\n\
\ \"acc_norm_stderr\": 0.0312561082442188\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6028097062579821,\n\
\ \"acc_stderr\": 0.017497905037159367,\n \"acc_norm\": 0.6028097062579821,\n\
\ \"acc_norm_stderr\": 0.017497905037159367\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.02675625512966377,\n\
\ \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.02675625512966377\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925298,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925298\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.02827435985489425,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.02827435985489425\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5154320987654321,\n \"acc_stderr\": 0.0278074900442762,\n\
\ \"acc_norm\": 0.5154320987654321,\n \"acc_norm_stderr\": 0.0278074900442762\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347247,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347247\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3650586701434159,\n\
\ \"acc_stderr\": 0.012296373743443478,\n \"acc_norm\": 0.3650586701434159,\n\
\ \"acc_norm_stderr\": 0.012296373743443478\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43790849673202614,\n \"acc_stderr\": 0.02007125788688652,\n \
\ \"acc_norm\": 0.43790849673202614,\n \"acc_norm_stderr\": 0.02007125788688652\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4448979591836735,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.4448979591836735,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5870646766169154,\n\
\ \"acc_stderr\": 0.03481520803367348,\n \"acc_norm\": 0.5870646766169154,\n\
\ \"acc_norm_stderr\": 0.03481520803367348\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5906432748538012,\n \"acc_stderr\": 0.03771283107626545,\n\
\ \"acc_norm\": 0.5906432748538012,\n \"acc_norm_stderr\": 0.03771283107626545\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44890171447013644,\n\
\ \"mc2_stderr\": 0.015424897702943842\n }\n}\n```"
repo_url: https://huggingface.co/LeoLM/leo-hessianai-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-28-30.897823.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-28-30.897823.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-28-30.897823.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-28-30.897823.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_28_30.897823
path:
- results_2023-10-04T00-28-30.897823.parquet
- split: latest
path:
- results_2023-10-04T00-28-30.897823.parquet
---
# Dataset Card for Evaluation run of LeoLM/leo-hessianai-7b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LeoLM/leo-hessianai-7b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LeoLM/leo-hessianai-7b-chat](https://huggingface.co/LeoLM/leo-hessianai-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat",
"harness_truthfulqa_mc_0",
split="train")
```
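The same pattern works for any of the 61 configurations and splits listed in this card's `configs` metadata. As a minimal sketch (using config and split names taken from that metadata), you can also load the aggregated `results` config, or pin a task's details to a specific timestamped run:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat",
    "results",
    split="latest",
)

# Per-sample details for one task, pinned to a specific run
# (the split name is the run timestamp listed in the card metadata).
details = load_dataset(
    "open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat",
    "harness_hellaswag_10",
    split="2023_10_04T00_28_30.897823",
)
```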
## Latest results
These are the [latest results from run 2023-10-04T00:28:30.897823](https://huggingface.co/datasets/open-llm-leaderboard/details_LeoLM__leo-hessianai-7b-chat/blob/main/results_2023-10-04T00-28-30.897823.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45818760509669304,
"acc_stderr": 0.03541282966101462,
"acc_norm": 0.46238611930946416,
"acc_norm_stderr": 0.03539979649202152,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44890171447013644,
"mc2_stderr": 0.015424897702943842
},
"harness|arc:challenge|25": {
"acc": 0.47525597269624575,
"acc_stderr": 0.014593487694937736,
"acc_norm": 0.5255972696245734,
"acc_norm_stderr": 0.014592230885298962
},
"harness|hellaswag|10": {
"acc": 0.5787691694881498,
"acc_stderr": 0.004927473370720143,
"acc_norm": 0.776140211113324,
"acc_norm_stderr": 0.004159773209765883
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101813,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101813
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561063,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561063
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.51010101010101,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.51010101010101,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6217616580310881,
"acc_stderr": 0.03499807276193338,
"acc_norm": 0.6217616580310881,
"acc_norm_stderr": 0.03499807276193338
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37948717948717947,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.37948717948717947,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6,
"acc_stderr": 0.02100420126042007,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02100420126042007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.034478911363533815,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.034478911363533815
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5201793721973094,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.5201793721973094,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179662,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179662
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6495726495726496,
"acc_stderr": 0.0312561082442188,
"acc_norm": 0.6495726495726496,
"acc_norm_stderr": 0.0312561082442188
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6028097062579821,
"acc_stderr": 0.017497905037159367,
"acc_norm": 0.6028097062579821,
"acc_norm_stderr": 0.017497905037159367
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.02675625512966377,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.02675625512966377
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925298,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489425,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489425
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347247,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347247
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3650586701434159,
"acc_stderr": 0.012296373743443478,
"acc_norm": 0.3650586701434159,
"acc_norm_stderr": 0.012296373743443478
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43790849673202614,
"acc_stderr": 0.02007125788688652,
"acc_norm": 0.43790849673202614,
"acc_norm_stderr": 0.02007125788688652
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4448979591836735,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.4448979591836735,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5870646766169154,
"acc_stderr": 0.03481520803367348,
"acc_norm": 0.5870646766169154,
"acc_norm_stderr": 0.03481520803367348
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5906432748538012,
"acc_stderr": 0.03771283107626545,
"acc_norm": 0.5906432748538012,
"acc_norm_stderr": 0.03771283107626545
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44890171447013644,
"mc2_stderr": 0.015424897702943842
}
}
```
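Since every per-task entry follows the same `acc`/`acc_stderr` schema, recomputing an aggregate such as the MMLU (hendrycksTest) average is straightforward. A minimal sketch, assuming the JSON blob above has been parsed into a Python dict named `results` (the `harness|...` keys are exactly those shown above):
```python
# Mean accuracy over the 57 hendrycksTest (MMLU) subtasks shown above.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```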
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-7c038f79-e101-4622-b3df-2e6d36e762b6 | 2023-10-04T00:42:38.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_openbmb__UltraLM-13b | 2023-10-04T00:34:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of openbmb/UltraLM-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openbmb/UltraLM-13b](https://huggingface.co/openbmb/UltraLM-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openbmb__UltraLM-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:32:52.750601](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-13b/blob/main/results_2023-10-04T00-32-52.750601.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2317456285682548,\n\
\ \"acc_stderr\": 0.03071580855473494,\n \"acc_norm\": 0.23272687783098286,\n\
\ \"acc_norm_stderr\": 0.030730390543818306,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4861387255603705,\n\
\ \"mc2_stderr\": 0.01574665894684377\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23976109215017063,\n \"acc_stderr\": 0.012476304127453947,\n\
\ \"acc_norm\": 0.29436860068259385,\n \"acc_norm_stderr\": 0.013318528460539426\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2566221868153754,\n\
\ \"acc_stderr\": 0.004358764596401033,\n \"acc_norm\": 0.2599083847839076,\n\
\ \"acc_norm_stderr\": 0.004376877619234126\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.015000674373570345,\n\
\ \"mc2\": 0.4861387255603705,\n \"mc2_stderr\": 0.01574665894684377\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openbmb/UltraLM-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-32-52.750601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-32-52.750601.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-32-52.750601.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-32-52.750601.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_32_52.750601
path:
- results_2023-10-04T00-32-52.750601.parquet
- split: latest
path:
- results_2023-10-04T00-32-52.750601.parquet
---
# Dataset Card for Evaluation run of openbmb/UltraLM-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openbmb/UltraLM-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openbmb/UltraLM-13b](https://huggingface.co/openbmb/UltraLM-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-13b",
"harness_truthfulqa_mc_0",
split="train")
```
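The same pattern works for any of the configurations listed in this card's config section. As a minimal sketch (assuming the `datasets` library is installed), the aggregated metrics can be loaded from the "results" configuration, and a specific run can be addressed by its timestamped split name:
```python
from datasets import load_dataset

# Aggregated metrics for the latest run ("results" is the extra
# configuration listed in this card's config section).
results = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-13b",
                       "results",
                       split="latest")

# Per-task details for one specific run, addressed by its timestamp.
arc_run = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-13b",
                       "harness_arc_challenge_25",
                       split="2023_10_04T00_32_52.750601")
```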
## Latest results
These are the [latest results from run 2023-10-04T00:32:52.750601](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-13b/blob/main/results_2023-10-04T00-32-52.750601.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2317456285682548,
"acc_stderr": 0.03071580855473494,
"acc_norm": 0.23272687783098286,
"acc_norm_stderr": 0.030730390543818306,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4861387255603705,
"mc2_stderr": 0.01574665894684377
},
"harness|arc:challenge|25": {
"acc": 0.23976109215017063,
"acc_stderr": 0.012476304127453947,
"acc_norm": 0.29436860068259385,
"acc_norm_stderr": 0.013318528460539426
},
"harness|hellaswag|10": {
"acc": 0.2566221868153754,
"acc_stderr": 0.004358764596401033,
"acc_norm": 0.2599083847839076,
"acc_norm_stderr": 0.004376877619234126
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4861387255603705,
"mc2_stderr": 0.01574665894684377
}
}
```
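Because successive evals don't necessarily cover the same tasks, it can help to enumerate what a details repo actually contains before loading anything. A minimal sketch using the inspection helpers from the `datasets` library (the exact output depends on how many runs have been uploaded):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_openbmb__UltraLM-13b"

# One configuration per evaluated task, plus the aggregated "results" one.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations")

# Each configuration exposes one timestamped split per run, plus "latest".
print(get_dataset_split_names(repo, "harness_truthfulqa_mc_0"))
```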
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt-en | 2023-10-04T00:35:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of HWERI/pythia-70m-deduped-cleansharegpt-en
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HWERI/pythia-70m-deduped-cleansharegpt-en](https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt-en)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt-en\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:34:36.927463](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt-en/blob/main/results_2023-10-04T00-34-36.927463.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25169656148677383,\n\
\ \"acc_stderr\": 0.031347346501202376,\n \"acc_norm\": 0.25208031444926915,\n\
\ \"acc_norm_stderr\": 0.03135407929770038,\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.01548369193923726,\n \"mc2\": 0.48569344141607573,\n\
\ \"mc2_stderr\": 0.015964870546396556\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1945392491467577,\n \"acc_stderr\": 0.011567709174648728,\n\
\ \"acc_norm\": 0.21160409556313994,\n \"acc_norm_stderr\": 0.011935916358632864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26598287193786097,\n\
\ \"acc_stderr\": 0.004409521343140105,\n \"acc_norm\": 0.27155945030870343,\n\
\ \"acc_norm_stderr\": 0.004438549152538039\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756192,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756192\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24193548387096775,\n \"acc_stderr\": 0.0243625996930311,\n \"\
acc_norm\": 0.24193548387096775,\n \"acc_norm_stderr\": 0.0243625996930311\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.031584153240477086,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.031584153240477086\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3316062176165803,\n \"acc_stderr\": 0.03397636541089116,\n\
\ \"acc_norm\": 0.3316062176165803,\n \"acc_norm_stderr\": 0.03397636541089116\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3384615384615385,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.3384615384615385,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587192,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587192\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786381,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786381\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21467889908256882,\n \"acc_stderr\": 0.017604304149256487,\n \"\
acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.017604304149256487\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23371647509578544,\n\
\ \"acc_stderr\": 0.01513338327898884,\n \"acc_norm\": 0.23371647509578544,\n\
\ \"acc_norm_stderr\": 0.01513338327898884\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.17684887459807075,\n\
\ \"acc_stderr\": 0.021670058885510803,\n \"acc_norm\": 0.17684887459807075,\n\
\ \"acc_norm_stderr\": 0.021670058885510803\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20987654320987653,\n \"acc_stderr\": 0.022658344085981368,\n\
\ \"acc_norm\": 0.20987654320987653,\n \"acc_norm_stderr\": 0.022658344085981368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913222,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913222\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.01548369193923726,\n \"mc2\": 0.48569344141607573,\n\
\ \"mc2_stderr\": 0.015964870546396556\n }\n}\n```"
repo_url: https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt-en
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-34-36.927463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-34-36.927463.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-34-36.927463.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-34-36.927463.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_34_36.927463
path:
- results_2023-10-04T00-34-36.927463.parquet
- split: latest
path:
- results_2023-10-04T00-34-36.927463.parquet
---
# Dataset Card for Evaluation run of HWERI/pythia-70m-deduped-cleansharegpt-en
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt-en
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HWERI/pythia-70m-deduped-cleansharegpt-en](https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt-en) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt-en",
"harness_truthfulqa_mc_0",
split="train")
```
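The aggregated scores live in the `results` configuration. As a minimal sketch (using only the config and split names declared in this card's YAML header), they can be loaded the same way:
```python
from datasets import load_dataset

# Aggregated scores; the "latest" split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt-en",
    "results",
    split="latest",
)
```
Any timestamped split listed above (e.g. `2023_10_04T00_34_36.927463`) can be passed as `split=` instead to pin a specific run.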
## Latest results
These are the [latest results from run 2023-10-04T00:34:36.927463](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt-en/blob/main/results_2023-10-04T00-34-36.927463.json). Note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval:
```python
{
"all": {
"acc": 0.25169656148677383,
"acc_stderr": 0.031347346501202376,
"acc_norm": 0.25208031444926915,
"acc_norm_stderr": 0.03135407929770038,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.01548369193923726,
"mc2": 0.48569344141607573,
"mc2_stderr": 0.015964870546396556
},
"harness|arc:challenge|25": {
"acc": 0.1945392491467577,
"acc_stderr": 0.011567709174648728,
"acc_norm": 0.21160409556313994,
"acc_norm_stderr": 0.011935916358632864
},
"harness|hellaswag|10": {
"acc": 0.26598287193786097,
"acc_stderr": 0.004409521343140105,
"acc_norm": 0.27155945030870343,
"acc_norm_stderr": 0.004438549152538039
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756192,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756192
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.0243625996930311,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.0243625996930311
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.031584153240477086,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.031584153240477086
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3316062176165803,
"acc_stderr": 0.03397636541089116,
"acc_norm": 0.3316062176165803,
"acc_norm_stderr": 0.03397636541089116
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3384615384615385,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.3384615384615385,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587192,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587192
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.017604304149256487,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.017604304149256487
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23371647509578544,
"acc_stderr": 0.01513338327898884,
"acc_norm": 0.23371647509578544,
"acc_norm_stderr": 0.01513338327898884
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.17684887459807075,
"acc_stderr": 0.021670058885510803,
"acc_norm": 0.17684887459807075,
"acc_norm_stderr": 0.021670058885510803
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20987654320987653,
"acc_stderr": 0.022658344085981368,
"acc_norm": 0.20987654320987653,
"acc_norm_stderr": 0.022658344085981368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913222,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.01548369193923726,
"mc2": 0.48569344141607573,
"mc2_stderr": 0.015964870546396556
}
}
```
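To summarize these numbers without going through `datasets`, the raw results file can also be fetched directly. The sketch below is illustrative only: it assumes the JSON on the Hub either matches the dictionary printed above or nests it under a `"results"` key, and it uses the filename declared in this card's `results` config.
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the aggregated results file named in the "results" config above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt-en",
    filename="results_2023-10-04T00-34-36.927463.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumption: the metric dict is either top-level (as printed above)
# or nested under a "results" key; handle both layouts.
metrics = data.get("results", data)

# Print normalized accuracy per task, skipping the "all" aggregate.
for task, scores in sorted(metrics.items()):
    if task != "all" and "acc_norm" in scores:
        print(f"{task}: {scores['acc_norm']:.4f}")
```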
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
LLMGlobalyTest/Herramientas-1 | 2023-10-04T00:35:34.000Z | [
"region:us"
] | LLMGlobalyTest | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-ecce347c-b8e4-4e99-aec3-e30a53023852 | 2023-10-04T00:48:24.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B | 2023-10-04T00:47:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Stheno-1.8-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-1.8-L2-13B](https://huggingface.co/Sao10K/Stheno-1.8-L2-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T00:45:37.800224](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B/blob/main/results_2023-10-04T00-45-37.800224.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5870585627938865,\n\
\ \"acc_stderr\": 0.03401678665014914,\n \"acc_norm\": 0.5908931171207953,\n\
\ \"acc_norm_stderr\": 0.03399400896031717,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5285573158387327,\n\
\ \"mc2_stderr\": 0.015665952477099693\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.0140702655192688\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6447918741286597,\n\
\ \"acc_stderr\": 0.00477598265035592,\n \"acc_norm\": 0.841167098187612,\n\
\ \"acc_norm_stderr\": 0.003647731723938836\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920935,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920935\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.026860206444724345,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.026860206444724345\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n \"\
acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.0249626835643318,\n \
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.0249626835643318\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7798165137614679,\n \"acc_stderr\": 0.01776597865232756,\n \"\
acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.01776597865232756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665224,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665224\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139956,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139956\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4949720670391061,\n\
\ \"acc_stderr\": 0.01672165603753843,\n \"acc_norm\": 0.4949720670391061,\n\
\ \"acc_norm_stderr\": 0.01672165603753843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.02679542232789394,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.02679542232789394\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729147,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6013071895424836,\n \"acc_stderr\": 0.019808281317449848,\n \
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.019808281317449848\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5285573158387327,\n\
\ \"mc2_stderr\": 0.015665952477099693\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-1.8-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-45-37.800224.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-45-37.800224.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-45-37.800224.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-45-37.800224.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_45_37.800224
path:
- results_2023-10-04T00-45-37.800224.parquet
- split: latest
path:
- results_2023-10-04T00-45-37.800224.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-1.8-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-1.8-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.8-L2-13B](https://huggingface.co/Sao10K/Stheno-1.8-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B",
"harness_truthfulqa_mc_0",
split="train")
```
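Besides per-task details, the configuration listing above also exposes the aggregated scores under the `results` config, and every run under a timestamped split. A minimal sketch of both access patterns, assuming the config and split names listed in the YAML above:
```python
from datasets import load_dataset

# Aggregated scores for the run; the "latest" split always points to the
# most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B",
    "results",
    split="latest",
)

# Per-sample details for one MMLU subtask, pinned to a specific run by its
# timestamped split name.
world_religions = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B",
    "harness_hendrycksTest_world_religions_5",
    split="2023_10_04T00_45_37.800224",
)
```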
## Latest results
These are the [latest results from run 2023-10-04T00:45:37.800224](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B/blob/main/results_2023-10-04T00-45-37.800224.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5870585627938865,
"acc_stderr": 0.03401678665014914,
"acc_norm": 0.5908931171207953,
"acc_norm_stderr": 0.03399400896031717,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5285573158387327,
"mc2_stderr": 0.015665952477099693
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.0140702655192688
},
"harness|hellaswag|10": {
"acc": 0.6447918741286597,
"acc_stderr": 0.00477598265035592,
"acc_norm": 0.841167098187612,
"acc_norm_stderr": 0.003647731723938836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920935,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920935
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724345,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724345
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.0249626835643318,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.0249626835643318
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.01776597865232756,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.01776597865232756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665224,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665224
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139956,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139956
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4949720670391061,
"acc_stderr": 0.01672165603753843,
"acc_norm": 0.4949720670391061,
"acc_norm_stderr": 0.01672165603753843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.02679542232789394,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.02679542232789394
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729147,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532067,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125474,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.019808281317449848,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.019808281317449848
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5285573158387327,
"mc2_stderr": 0.015665952477099693
}
}
```
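To read a single score from these results programmatically, one option is to download the linked JSON file directly. The sketch below assumes the file's layout mirrors the dictionary printed above (the hosted file may wrap these scores under an extra top-level key):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Sao10K__Stheno-1.8-L2-13B",
    filename="results_2023-10-04T00-45-37.800224.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# NOTE: assumes the JSON mirrors the dict shown above.
print(results["all"]["acc_norm"])                 # 0.5908931171207953
print(results["harness|truthfulqa:mc|0"]["mc2"])  # 0.5285573158387327
```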
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-f613ed90-5d50-42bf-b8f0-25e973171822 | 2023-10-04T01:01:15.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
XisDraki3142/silasbr | 2023-10-04T00:58:11.000Z | [
"license:apache-2.0",
"region:us"
] | XisDraki3142 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
atom-in-the-universe/bild-c1b95a65-34ae-4aee-9f9c-5fb7651538a5 | 2023-10-04T01:16:41.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-fd86e73b-15f3-4f1c-85c9-1364b06f5a55 | 2023-10-04T01:29:31.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_SummerSigh__GPTNeo350M-Instruct-SFT | 2023-10-04T01:23:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of SummerSigh/GPTNeo350M-Instruct-SFT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SummerSigh/GPTNeo350M-Instruct-SFT](https://huggingface.co/SummerSigh/GPTNeo350M-Instruct-SFT)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SummerSigh__GPTNeo350M-Instruct-SFT\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T01:22:31.549356](https://huggingface.co/datasets/open-llm-leaderboard/details_SummerSigh__GPTNeo350M-Instruct-SFT/blob/main/results_2023-10-04T01-22-31.549356.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2583242277091245,\n\
\ \"acc_stderr\": 0.03170493074473348,\n \"acc_norm\": 0.25982160333495524,\n\
\ \"acc_norm_stderr\": 0.031717595305847064,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.45247532824340125,\n\
\ \"mc2_stderr\": 0.015698141799631457\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063295,\n\
\ \"acc_norm\": 0.2593856655290102,\n \"acc_norm_stderr\": 0.012808273573927113\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33041226847241584,\n\
\ \"acc_stderr\": 0.004694002781939553,\n \"acc_norm\": 0.38548097988448515,\n\
\ \"acc_norm_stderr\": 0.004857140410776745\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.0340654205850265,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.0340654205850265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.026055296901152922,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.026055296901152922\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102967,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102967\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.22486772486772486,\n \"acc_stderr\": 0.021502096078229147,\n \"\
acc_norm\": 0.22486772486772486,\n \"acc_norm_stderr\": 0.021502096078229147\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229886,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229886\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.03308818594415752,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.03308818594415752\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467298,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467298\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863786,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360383,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360383\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28256880733944956,\n \"acc_stderr\": 0.019304243497707152,\n \"\
acc_norm\": 0.28256880733944956,\n \"acc_norm_stderr\": 0.019304243497707152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425173,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425173\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2320675105485232,\n \"acc_stderr\": 0.02747974455080852,\n \
\ \"acc_norm\": 0.2320675105485232,\n \"acc_norm_stderr\": 0.02747974455080852\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.16143497757847533,\n\
\ \"acc_stderr\": 0.024693957899128472,\n \"acc_norm\": 0.16143497757847533,\n\
\ \"acc_norm_stderr\": 0.024693957899128472\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.047504583990416925,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.047504583990416925\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\
\ \"acc_stderr\": 0.01538435228454394,\n \"acc_norm\": 0.24521072796934865,\n\
\ \"acc_norm_stderr\": 0.01538435228454394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210735,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210735\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.025494259350694888,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.025494259350694888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0227797190887334,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0227797190887334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266726,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266726\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23728813559322035,\n\
\ \"acc_stderr\": 0.010865436690780269,\n \"acc_norm\": 0.23728813559322035,\n\
\ \"acc_norm_stderr\": 0.010865436690780269\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22549019607843138,\n \"acc_stderr\": 0.016906615927288145,\n \
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.016906615927288145\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721378,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721378\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.028666857790274655,\n\
\ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.028666857790274655\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117827,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117827\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.45247532824340125,\n\
\ \"mc2_stderr\": 0.015698141799631457\n }\n}\n```"
repo_url: https://huggingface.co/SummerSigh/GPTNeo350M-Instruct-SFT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-22-31.549356.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-22-31.549356.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-22-31.549356.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-22-31.549356.parquet'
- config_name: results
data_files:
- split: 2023_10_04T01_22_31.549356
path:
- results_2023-10-04T01-22-31.549356.parquet
- split: latest
path:
- results_2023-10-04T01-22-31.549356.parquet
---
# Dataset Card for Evaluation run of SummerSigh/GPTNeo350M-Instruct-SFT
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SummerSigh/GPTNeo350M-Instruct-SFT
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [SummerSigh/GPTNeo350M-Instruct-SFT](https://huggingface.co/SummerSigh/GPTNeo350M-Instruct-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SummerSigh__GPTNeo350M-Instruct-SFT",
"harness_truthfulqa_mc_0",
split="train")
```
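The aggregated metrics live in the "results" configuration rather than in the per-task configs. As a minimal sketch (the config and split names below are taken directly from the `configs` section of this card), you can load the latest aggregated results the same way:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; like every per-task config,
# it exposes a "latest" split plus one split per run timestamp
# (here "2023_10_04T01_22_31.549356").
results = load_dataset(
    "open-llm-leaderboard/details_SummerSigh__GPTNeo350M-Instruct-SFT",
    "results",
    split="latest",
)
```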
## Latest results
These are the [latest results from run 2023-10-04T01:22:31.549356](https://huggingface.co/datasets/open-llm-leaderboard/details_SummerSigh__GPTNeo350M-Instruct-SFT/blob/main/results_2023-10-04T01-22-31.549356.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2583242277091245,
"acc_stderr": 0.03170493074473348,
"acc_norm": 0.25982160333495524,
"acc_norm_stderr": 0.031717595305847064,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522512,
"mc2": 0.45247532824340125,
"mc2_stderr": 0.015698141799631457
},
"harness|arc:challenge|25": {
"acc": 0.22610921501706485,
"acc_stderr": 0.012224202097063295,
"acc_norm": 0.2593856655290102,
"acc_norm_stderr": 0.012808273573927113
},
"harness|hellaswag|10": {
"acc": 0.33041226847241584,
"acc_stderr": 0.004694002781939553,
"acc_norm": 0.38548097988448515,
"acc_norm_stderr": 0.004857140410776745
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.0340654205850265,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.0340654205850265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.026055296901152922,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.026055296901152922
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102967,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102967
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.22486772486772486,
"acc_stderr": 0.021502096078229147,
"acc_norm": 0.22486772486772486,
"acc_norm_stderr": 0.021502096078229147
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229886,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229886
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.03308818594415752,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.03308818594415752
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467298,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467298
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863786,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360383,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360383
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28256880733944956,
"acc_stderr": 0.019304243497707152,
"acc_norm": 0.28256880733944956,
"acc_norm_stderr": 0.019304243497707152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425173,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425173
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2320675105485232,
"acc_stderr": 0.02747974455080852,
"acc_norm": 0.2320675105485232,
"acc_norm_stderr": 0.02747974455080852
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.16143497757847533,
"acc_stderr": 0.024693957899128472,
"acc_norm": 0.16143497757847533,
"acc_norm_stderr": 0.024693957899128472
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.047504583990416925,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.047504583990416925
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24521072796934865,
"acc_stderr": 0.01538435228454394,
"acc_norm": 0.24521072796934865,
"acc_norm_stderr": 0.01538435228454394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210735,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210735
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.025494259350694888,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.025494259350694888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266726,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266726
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23728813559322035,
"acc_stderr": 0.010865436690780269,
"acc_norm": 0.23728813559322035,
"acc_norm_stderr": 0.010865436690780269
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.016906615927288145,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.016906615927288145
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721378,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.028666857790274655,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.028666857790274655
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117827,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117827
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522512,
"mc2": 0.45247532824340125,
"mc2_stderr": 0.015698141799631457
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B | 2023-10-04T01:27:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Zephyrus-L1-33B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Zephyrus-L1-33B](https://huggingface.co/Sao10K/Zephyrus-L1-33B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run\
\ can be found as a specific split in each configuration, the split being named\
\ using the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics\
\ on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T01:26:11.799613](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B/blob/main/results_2023-10-04T01-26-11.799613.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5755752386680311,\n\
\ \"acc_stderr\": 0.03433513463423277,\n \"acc_norm\": 0.5794362999338591,\n\
\ \"acc_norm_stderr\": 0.034311745756975556,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965824,\n \"mc2\": 0.5386582093302932,\n\
\ \"mc2_stderr\": 0.0153784519278671\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.014235872487909869,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094092\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6460864369647481,\n\
\ \"acc_stderr\": 0.0047720549044044415,\n \"acc_norm\": 0.8414658434574785,\n\
\ \"acc_norm_stderr\": 0.003644946730044613\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5509433962264151,\n \"acc_stderr\": 0.030612730713641095,\n\
\ \"acc_norm\": 0.5509433962264151,\n \"acc_norm_stderr\": 0.030612730713641095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028435,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028435\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091707,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.01513338327898883,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.01513338327898883\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895817,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429125,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429125\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02782610930728369,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02782610930728369\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714867,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714867\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829163,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829163\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965824,\n \"mc2\": 0.5386582093302932,\n\
\ \"mc2_stderr\": 0.0153784519278671\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Zephyrus-L1-33B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-26-11.799613.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-26-11.799613.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-26-11.799613.parquet'
- config_name: results
data_files:
- split: 2023_10_04T01_26_11.799613
path:
- results_2023-10-04T01-26-11.799613.parquet
- split: latest
path:
- results_2023-10-04T01-26-11.799613.parquet
---
# Dataset Card for Evaluation run of Sao10K/Zephyrus-L1-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Zephyrus-L1-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Zephyrus-L1-33B](https://huggingface.co/Sao10K/Zephyrus-L1-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B",
"harness_truthfulqa_mc_0",
split="train")
```
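You can also point `load_dataset` at any of the other configurations listed in this card. As a minimal sketch, to read the aggregated scores rather than the per-sample details, you can load the "results" configuration with its "latest" split (both names appear in the configs section above):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; the "latest" split
# always points to the most recent evaluation of this model.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B",
	"results",
	split="latest")
```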
## Latest results
These are the [latest results from run 2023-10-04T01:26:11.799613](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B/blob/main/results_2023-10-04T01-26-11.799613.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5755752386680311,
"acc_stderr": 0.03433513463423277,
"acc_norm": 0.5794362999338591,
"acc_norm_stderr": 0.034311745756975556,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965824,
"mc2": 0.5386582093302932,
"mc2_stderr": 0.0153784519278671
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.014235872487909869,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094092
},
"harness|hellaswag|10": {
"acc": 0.6460864369647481,
"acc_stderr": 0.0047720549044044415,
"acc_norm": 0.8414658434574785,
"acc_norm_stderr": 0.003644946730044613
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5509433962264151,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.5509433962264151,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028435,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028435
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091707,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.01513338327898883,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.01513338327898883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895817,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429125,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429125
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02782610930728369,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02782610930728369
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.030042615832714867,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.030042615832714867
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829163,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829163
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965824,
"mc2": 0.5386582093302932,
"mc2_stderr": 0.0153784519278671
}
}
```
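If you would rather work with the raw results file directly, here is a minimal sketch using `huggingface_hub` (the file name is taken from the link above, and the key access assumes the JSON structure shown in the snippet; adjust both if you target a different run):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Sao10K__Zephyrus-L1-33B",
    filename="results_2023-10-04T01-26-11.799613.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # overall accuracy, as displayed above
```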
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
daoviedo03/categorias-11k | 2023-10-04T05:09:32.000Z | [
"region:us"
] | daoviedo03 | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-cd6e4381-146a-46d4-b6bc-2da1fb0e9854 | 2023-10-04T01:42:26.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_maximuslee07__llama-2-7b-rockwell-final | 2023-10-04T01:35:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of maximuslee07/llama-2-7b-rockwell-final
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maximuslee07/llama-2-7b-rockwell-final](https://huggingface.co/maximuslee07/llama-2-7b-rockwell-final)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maximuslee07__llama-2-7b-rockwell-final\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T01:33:36.813954](https://huggingface.co/datasets/open-llm-leaderboard/details_maximuslee07__llama-2-7b-rockwell-final/blob/main/results_2023-10-04T01-33-36.813954.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48145125617914153,\n\
\ \"acc_stderr\": 0.03517347709035305,\n \"acc_norm\": 0.48492967913914725,\n\
\ \"acc_norm_stderr\": 0.035159606355607595,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4721000283156327,\n\
\ \"mc2_stderr\": 0.016013403484864998\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4974402730375427,\n \"acc_stderr\": 0.014611199329843791,\n\
\ \"acc_norm\": 0.5273037542662116,\n \"acc_norm_stderr\": 0.014589589101985996\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6156144194383589,\n\
\ \"acc_stderr\": 0.004854555294017553,\n \"acc_norm\": 0.7909778928500298,\n\
\ \"acc_norm_stderr\": 0.00405779217189357\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502707,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.5096774193548387,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.0346488167501634,\n \"acc_norm\"\
: 0.6161616161616161,\n \"acc_norm_stderr\": 0.0346488167501634\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.02514180151117749,\n \
\ \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.02514180151117749\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6770642201834862,\n \"acc_stderr\": 0.020048115923415318,\n \"\
acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.020048115923415318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6568627450980392,\n \"acc_stderr\": 0.033321399446680854,\n \"\
acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.031450686007448596,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.031450686007448596\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.039265223787088445,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.039265223787088445\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.03057281131029961,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.03057281131029961\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n\
\ \"acc_stderr\": 0.01696703176641362,\n \"acc_norm\": 0.6577266922094508,\n\
\ \"acc_norm_stderr\": 0.01696703176641362\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.026915047355369818,\n\
\ \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.026915047355369818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.014149575348976266,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.014149575348976266\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n\
\ \"acc_stderr\": 0.02832032583010591,\n \"acc_norm\": 0.5369774919614148,\n\
\ \"acc_norm_stderr\": 0.02832032583010591\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n\
\ \"acc_stderr\": 0.0121614177297498,\n \"acc_norm\": 0.3474576271186441,\n\
\ \"acc_norm_stderr\": 0.0121614177297498\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004144,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004144\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4624183006535948,\n \"acc_stderr\": 0.020170614974969765,\n \
\ \"acc_norm\": 0.4624183006535948,\n \"acc_norm_stderr\": 0.020170614974969765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5428571428571428,\n \"acc_stderr\": 0.031891418324213966,\n\
\ \"acc_norm\": 0.5428571428571428,\n \"acc_norm_stderr\": 0.031891418324213966\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245231,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245231\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4721000283156327,\n\
\ \"mc2_stderr\": 0.016013403484864998\n }\n}\n```"
repo_url: https://huggingface.co/maximuslee07/llama-2-7b-rockwell-final
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-33-36.813954.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-33-36.813954.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-33-36.813954.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-33-36.813954.parquet'
- config_name: results
data_files:
- split: 2023_10_04T01_33_36.813954
path:
- results_2023-10-04T01-33-36.813954.parquet
- split: latest
path:
- results_2023-10-04T01-33-36.813954.parquet
---
# Dataset Card for Evaluation run of maximuslee07/llama-2-7b-rockwell-final
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/maximuslee07/llama-2-7b-rockwell-final
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [maximuslee07/llama-2-7b-rockwell-final](https://huggingface.co/maximuslee07/llama-2-7b-rockwell-final) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maximuslee07__llama-2-7b-rockwell-final",
"harness_truthfulqa_mc_0",
split="train")
```
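Each per-task configuration listed in this card exposes two splits: one named after the run timestamp and a "latest" alias, while the aggregated scores live in the "results" configuration. A short sketch using the config and split names from this card:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_maximuslee07__llama-2-7b-rockwell-final"

# Aggregated scores for the run (config and split names as listed in this card).
agg = load_dataset(REPO, "results", split="latest")

# A single MMLU subtask, pinned to the explicit run timestamp instead of "latest".
law = load_dataset(
    REPO,
    "harness_hendrycksTest_professional_law_5",
    split="2023_10_04T01_33_36.813954",
)
```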
## Latest results
These are the [latest results from run 2023-10-04T01:33:36.813954](https://huggingface.co/datasets/open-llm-leaderboard/details_maximuslee07__llama-2-7b-rockwell-final/blob/main/results_2023-10-04T01-33-36.813954.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48145125617914153,
"acc_stderr": 0.03517347709035305,
"acc_norm": 0.48492967913914725,
"acc_norm_stderr": 0.035159606355607595,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.4721000283156327,
"mc2_stderr": 0.016013403484864998
},
"harness|arc:challenge|25": {
"acc": 0.4974402730375427,
"acc_stderr": 0.014611199329843791,
"acc_norm": 0.5273037542662116,
"acc_norm_stderr": 0.014589589101985996
},
"harness|hellaswag|10": {
"acc": 0.6156144194383589,
"acc_stderr": 0.004854555294017553,
"acc_norm": 0.7909778928500298,
"acc_norm_stderr": 0.00405779217189357
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502707,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.0346488167501634,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.0346488167501634
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.02514180151117749,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.02514180151117749
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.020048115923415318,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.020048115923415318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.031450686007448596,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.031450686007448596
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.039265223787088445,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.039265223787088445
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.03057281131029961,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.03057281131029961
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.01696703176641362,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.01696703176641362
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.026915047355369818,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.026915047355369818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976266,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976266
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.02832032583010591,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.02832032583010591
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3474576271186441,
"acc_stderr": 0.0121614177297498,
"acc_norm": 0.3474576271186441,
"acc_norm_stderr": 0.0121614177297498
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004144,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004144
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4624183006535948,
"acc_stderr": 0.020170614974969765,
"acc_norm": 0.4624183006535948,
"acc_norm_stderr": 0.020170614974969765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5428571428571428,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.5428571428571428,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.03528211258245231,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.03528211258245231
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.4721000283156327,
"mc2_stderr": 0.016013403484864998
}
}
```
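If you only need the aggregated scores above rather than the per-sample details, you can load the "results" configuration directly. A minimal sketch, assuming this repository follows the same `results`/`latest` layout as the other evaluation datasets in this collection:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split is assumed to point to the newest results file.
results = load_dataset(
    "open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf",
    "results",
    split="latest",
)
print(results)
```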
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mikasilvy/a333ef | 2023-10-04T01:42:07.000Z | [
"region:us"
] | mikasilvy | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-542425fc-297e-40d3-bd69-49e14ed951b1 | 2023-10-04T01:56:36.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_chickencaesar__llama2-platypus-llama2-chat-13B-hf | 2023-10-04T01:51:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chickencaesar/llama2-platypus-llama2-chat-13B-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chickencaesar/llama2-platypus-llama2-chat-13B-hf](https://huggingface.co/chickencaesar/llama2-platypus-llama2-chat-13B-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chickencaesar__llama2-platypus-llama2-chat-13B-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T01:49:37.697081](https://huggingface.co/datasets/open-llm-leaderboard/details_chickencaesar__llama2-platypus-llama2-chat-13B-hf/blob/main/results_2023-10-04T01-49-37.697081.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5695124494293777,\n\
\ \"acc_stderr\": 0.03434852343127591,\n \"acc_norm\": 0.574038327074039,\n\
\ \"acc_norm_stderr\": 0.03432422706760388,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.42933647624225024,\n\
\ \"mc2_stderr\": 0.014930954663935193\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182526,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6227843059151563,\n\
\ \"acc_stderr\": 0.0048369903732615694,\n \"acc_norm\": 0.8275243975303724,\n\
\ \"acc_norm_stderr\": 0.003770211859118942\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n \
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.0433643270799318,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.0433643270799318\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6548387096774193,\n \"acc_stderr\": 0.027045746573534327,\n \"\
acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.027045746573534327\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392902,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392902\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.015133383278988829,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.015133383278988829\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576274,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576274\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n\
\ \"acc_stderr\": 0.016145881256056215,\n \"acc_norm\": 0.36983240223463687,\n\
\ \"acc_norm_stderr\": 0.016145881256056215\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110303,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.02659678228769704,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.02659678228769704\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n\
\ \"acc_stderr\": 0.012737361318730581,\n \"acc_norm\": 0.4641460234680574,\n\
\ \"acc_norm_stderr\": 0.012737361318730581\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159703,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105935,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105935\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.42933647624225024,\n\
\ \"mc2_stderr\": 0.014930954663935193\n }\n}\n```"
repo_url: https://huggingface.co/chickencaesar/llama2-platypus-llama2-chat-13B-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-49-37.697081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-49-37.697081.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-49-37.697081.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-49-37.697081.parquet'
- config_name: results
data_files:
- split: 2023_10_04T01_49_37.697081
path:
- results_2023-10-04T01-49-37.697081.parquet
- split: latest
path:
- results_2023-10-04T01-49-37.697081.parquet
---
# Dataset Card for Evaluation run of chickencaesar/llama2-platypus-llama2-chat-13B-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chickencaesar/llama2-platypus-llama2-chat-13B-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chickencaesar/llama2-platypus-llama2-chat-13B-hf](https://huggingface.co/chickencaesar/llama2-platypus-llama2-chat-13B-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chickencaesar__llama2-platypus-llama2-chat-13B-hf",
"harness_truthfulqa_mc_0",
split="train")
```
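Besides `split="train"`, each configuration declared in this card's metadata also exposes a run-specific split named after the evaluation timestamp and a `latest` alias. A minimal sketch using the config and split names listed above:
```python
from datasets import load_dataset

# Per-sample details for one task; "latest" always resolves to the newest run.
data = load_dataset(
    "open-llm-leaderboard/details_chickencaesar__llama2-platypus-llama2-chat-13B-hf",
    "harness_hellaswag_10",
    split="latest",
)
```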
## Latest results
These are the [latest results from run 2023-10-04T01:49:37.697081](https://huggingface.co/datasets/open-llm-leaderboard/details_chickencaesar__llama2-platypus-llama2-chat-13B-hf/blob/main/results_2023-10-04T01-49-37.697081.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5695124494293777,
"acc_stderr": 0.03434852343127591,
"acc_norm": 0.574038327074039,
"acc_norm_stderr": 0.03432422706760388,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.42933647624225024,
"mc2_stderr": 0.014930954663935193
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182526,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6227843059151563,
"acc_stderr": 0.0048369903732615694,
"acc_norm": 0.8275243975303724,
"acc_norm_stderr": 0.003770211859118942
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.0433643270799318,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.0433643270799318
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178816,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178816
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392902,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392902
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988829,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.026033890613576274,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.026033890613576274
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.016145881256056215,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.016145881256056215
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110303,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.02659678228769704,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.02659678228769704
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730581,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730581
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105935,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105935
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.42933647624225024,
"mc2_stderr": 0.014930954663935193
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-5c14c387-49e5-4833-b08f-1b1b67a8fa30 | 2023-10-04T02:08:59.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Enno-Ai__ennodata-13b-8bit-raw-15epoch | 2023-10-04T01:57:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Enno-Ai/ennodata-13b-8bit-raw-15epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Enno-Ai/ennodata-13b-8bit-raw-15epoch](https://huggingface.co/Enno-Ai/ennodata-13b-8bit-raw-15epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Enno-Ai__ennodata-13b-8bit-raw-15epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T01:56:25.933600](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__ennodata-13b-8bit-raw-15epoch/blob/main/results_2023-10-04T01-56-25.933600.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5764175560875521,\n\
\ \"acc_stderr\": 0.03451333662706103,\n \"acc_norm\": 0.5803208130129659,\n\
\ \"acc_norm_stderr\": 0.03449283313310436,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214294,\n \"mc2\": 0.5358311103869008,\n\
\ \"mc2_stderr\": 0.01569764342795165\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.01440561827943617,\n\
\ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6241784505078669,\n\
\ \"acc_stderr\": 0.00483344455633862,\n \"acc_norm\": 0.8220474009161521,\n\
\ \"acc_norm_stderr\": 0.00381691171167917\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6258064516129033,\n \"acc_stderr\": 0.027528904299845704,\n \"\
acc_norm\": 0.6258064516129033,\n \"acc_norm_stderr\": 0.027528904299845704\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419873,\n \"\
acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419873\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.02504919787604234,\n \
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.02504919787604234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588674,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588674\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891823,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946022,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946022\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48379888268156424,\n\
\ \"acc_stderr\": 0.016713720729501013,\n \"acc_norm\": 0.48379888268156424,\n\
\ \"acc_norm_stderr\": 0.016713720729501013\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488547,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488547\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.026289734945952926,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.026289734945952926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n\
\ \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n\
\ \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214294,\n \"mc2\": 0.5358311103869008,\n\
\ \"mc2_stderr\": 0.01569764342795165\n }\n}\n```"
repo_url: https://huggingface.co/Enno-Ai/ennodata-13b-8bit-raw-15epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-56-25.933600.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T01-56-25.933600.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-56-25.933600.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T01-56-25.933600.parquet'
- config_name: results
data_files:
- split: 2023_10_04T01_56_25.933600
path:
- results_2023-10-04T01-56-25.933600.parquet
- split: latest
path:
- results_2023-10-04T01-56-25.933600.parquet
---
# Dataset Card for Evaluation run of Enno-Ai/ennodata-13b-8bit-raw-15epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Enno-Ai/ennodata-13b-8bit-raw-15epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Enno-Ai/ennodata-13b-8bit-raw-15epoch](https://huggingface.co/Enno-Ai/ennodata-13b-8bit-raw-15epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Enno-Ai__ennodata-13b-8bit-raw-15epoch",
"harness_truthfulqa_mc_0",
split="train")
```
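To load the aggregated scores rather than the per-task details, you can instead point at the "results" configuration; a minimal sketch, assuming the config and split names listed in the `configs` section of this card's metadata:
```python
from datasets import load_dataset

# A minimal sketch: the "results" configuration stores the aggregated scores,
# and its "latest" split resolves to the newest timestamped results parquet
# (both names are taken from the `configs` section of this card's metadata).
results = load_dataset(
    "open-llm-leaderboard/details_Enno-Ai__ennodata-13b-8bit-raw-15epoch",
    "results",
    split="latest",
)
```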
## Latest results
These are the [latest results from run 2023-10-04T01:56:25.933600](https://huggingface.co/datasets/open-llm-leaderboard/details_Enno-Ai__ennodata-13b-8bit-raw-15epoch/blob/main/results_2023-10-04T01-56-25.933600.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5764175560875521,
"acc_stderr": 0.03451333662706103,
"acc_norm": 0.5803208130129659,
"acc_norm_stderr": 0.03449283313310436,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214294,
"mc2": 0.5358311103869008,
"mc2_stderr": 0.01569764342795165
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.01440561827943617,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.01421244498065189
},
"harness|hellaswag|10": {
"acc": 0.6241784505078669,
"acc_stderr": 0.00483344455633862,
"acc_norm": 0.8220474009161521,
"acc_norm_stderr": 0.00381691171167917
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6258064516129033,
"acc_stderr": 0.027528904299845704,
"acc_norm": 0.6258064516129033,
"acc_norm_stderr": 0.027528904299845704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419873,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419873
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588674,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891823,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891823
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946022,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946022
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48379888268156424,
"acc_stderr": 0.016713720729501013,
"acc_norm": 0.48379888268156424,
"acc_norm_stderr": 0.016713720729501013
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488547,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488547
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.026289734945952926,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.026289734945952926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829156,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829156
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214294,
"mc2": 0.5358311103869008,
"mc2_stderr": 0.01569764342795165
}
}
```
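For downstream analysis, the aggregated metrics above can also be read straight from the results file in this repo. A minimal sketch (assuming the `huggingface_hub` package is installed; the filename matches the run timestamp linked above, and the placement of the per-task block under a top-level `"results"` key is an assumption):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file for this run (dataset repo).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kittn__mistral-7B-v0.1-hf",
    filename="results_2023-10-03T19-50-01.602909.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The per-task block printed above is assumed to live under a top-level
# "results" key in the raw file; fall back to the document root otherwise.
tasks = data.get("results", data)
for name, metrics in tasks.items():
    if "acc_norm" in metrics:
        print(f"{name}: acc_norm={metrics['acc_norm']:.4f}")
```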
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Bluckr/function-calling-assistant-spanish-pofi-v2 | 2023-10-04T02:08:40.000Z | [
"license:other",
"region:us"
] | Bluckr | null | null | null | 1 | 0 | ---
license: other
license_name: uso-libre
license_link: LICENSE
---
|
atom-in-the-universe/bild-41faefeb-e4ee-4f54-b626-da884bef0899 | 2023-10-04T02:21:38.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_posicube__Llama-chat-AY-13B | 2023-10-04T02:17:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of posicube/Llama-chat-AY-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [posicube/Llama-chat-AY-13B](https://huggingface.co/posicube/Llama-chat-AY-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_posicube__Llama-chat-AY-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T02:16:36.083173](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama-chat-AY-13B/blob/main/results_2023-10-04T02-16-36.083173.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6007120326942734,\n\
\ \"acc_stderr\": 0.03386123886705212,\n \"acc_norm\": 0.6045415235420318,\n\
\ \"acc_norm_stderr\": 0.03383962263956823,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5594660039594063,\n\
\ \"mc2_stderr\": 0.015735735817854812\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578276,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6362278430591516,\n\
\ \"acc_stderr\": 0.004801009657690438,\n \"acc_norm\": 0.8323043218482374,\n\
\ \"acc_norm_stderr\": 0.003728322968874899\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981762,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981762\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667768,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667768\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878944,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878944\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.017493922404112648,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.017493922404112648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643524,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\
\ \"acc_stderr\": 0.014551310568143697,\n \"acc_norm\": 0.7905491698595147,\n\
\ \"acc_norm_stderr\": 0.014551310568143697\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400172,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4893854748603352,\n\
\ \"acc_stderr\": 0.016718732941192097,\n \"acc_norm\": 0.4893854748603352,\n\
\ \"acc_norm_stderr\": 0.016718732941192097\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5980392156862745,\n \"acc_stderr\": 0.019835176484375383,\n \
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.019835176484375383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5594660039594063,\n\
\ \"mc2_stderr\": 0.015735735817854812\n }\n}\n```"
repo_url: https://huggingface.co/posicube/Llama-chat-AY-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-16-36.083173.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-16-36.083173.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-16-36.083173.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_16_36.083173
path:
- results_2023-10-04T02-16-36.083173.parquet
- split: latest
path:
- results_2023-10-04T02-16-36.083173.parquet
---
# Dataset Card for Evaluation run of posicube/Llama-chat-AY-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/posicube/Llama-chat-AY-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [posicube/Llama-chat-AY-13B](https://huggingface.co/posicube/Llama-chat-AY-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_posicube__Llama-chat-AY-13B",
"harness_truthfulqa_mc_0",
split="train")
```
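The same call works for any of the per-task configurations listed in the YAML header above. Each run is also exposed as a split named after its timestamp, with `latest` pointing at the most recent one; a minimal sketch using the config and split names shown above:
```python
from datasets import load_dataset

# Per-task details for the ARC config, pinned to a specific run timestamp.
arc_run = load_dataset(
    "open-llm-leaderboard/details_posicube__Llama-chat-AY-13B",
    "harness_arc_challenge_25",
    split="2023_10_04T02_16_36.083173",
)

# Aggregated metrics for the most recent run via the "results" config.
results = load_dataset(
    "open-llm-leaderboard/details_posicube__Llama-chat-AY-13B",
    "results",
    split="latest",
)
```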
## Latest results
These are the [latest results from run 2023-10-04T02:16:36.083173](https://huggingface.co/datasets/open-llm-leaderboard/details_posicube__Llama-chat-AY-13B/blob/main/results_2023-10-04T02-16-36.083173.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6007120326942734,
"acc_stderr": 0.03386123886705212,
"acc_norm": 0.6045415235420318,
"acc_norm_stderr": 0.03383962263956823,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5594660039594063,
"mc2_stderr": 0.015735735817854812
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578276,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6362278430591516,
"acc_stderr": 0.004801009657690438,
"acc_norm": 0.8323043218482374,
"acc_norm_stderr": 0.003728322968874899
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981762,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981762
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878944,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878944
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712996,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712996
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.017493922404112648,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.017493922404112648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643524,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560403,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143697,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143697
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400172,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4893854748603352,
"acc_stderr": 0.016718732941192097,
"acc_norm": 0.4893854748603352,
"acc_norm_stderr": 0.016718732941192097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.019835176484375383,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.019835176484375383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5594660039594063,
"mc2_stderr": 0.015735735817854812
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Brian039/ADL_HW1 | 2023-10-04T02:18:35.000Z | [
"region:us"
] | Brian039 | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-6bc93b75-b412-40da-b060-6cda0639759f | 2023-10-04T02:36:52.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
DynamicSuperb/Dummy_Created_by_Attacker | 2023-10-04T03:13:08.000Z | [
"license:apache-2.0",
"region:us"
] | DynamicSuperb | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
marasama/nva-misya | 2023-10-04T02:23:12.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B | 2023-10-04T02:24:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gradientputri/MegaMix-T1-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gradientputri/MegaMix-T1-13B](https://huggingface.co/gradientputri/MegaMix-T1-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T02:23:36.963949](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B/blob/main/results_2023-10-04T02-23-36.963949.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5859017916951168,\n\
\ \"acc_stderr\": 0.033921662504131096,\n \"acc_norm\": 0.5896331581926526,\n\
\ \"acc_norm_stderr\": 0.03390063768124179,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.4819184208256549,\n\
\ \"mc2_stderr\": 0.015201987620383025\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910474\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6424019119697272,\n\
\ \"acc_stderr\": 0.004783133725599499,\n \"acc_norm\": 0.8343955387373033,\n\
\ \"acc_norm_stderr\": 0.003709654977628477\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940784,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7779816513761468,\n\
\ \"acc_stderr\": 0.017818849564796634,\n \"acc_norm\": 0.7779816513761468,\n\
\ \"acc_norm_stderr\": 0.017818849564796634\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n\
\ \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389087,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389087\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545857,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545857\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n\
\ \"acc_stderr\": 0.014774358319934492,\n \"acc_norm\": 0.7816091954022989,\n\
\ \"acc_norm_stderr\": 0.014774358319934492\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584194,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584194\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49162011173184356,\n\
\ \"acc_stderr\": 0.01672015279467255,\n \"acc_norm\": 0.49162011173184356,\n\
\ \"acc_norm_stderr\": 0.01672015279467255\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037103,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558555,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.012700582404768221,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.012700582404768221\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777515,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.4819184208256549,\n\
\ \"mc2_stderr\": 0.015201987620383025\n }\n}\n```"
repo_url: https://huggingface.co/gradientputri/MegaMix-T1-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-23-36.963949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-23-36.963949.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-23-36.963949.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-23-36.963949.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_23_36.963949
path:
- results_2023-10-04T02-23-36.963949.parquet
- split: latest
path:
- results_2023-10-04T02-23-36.963949.parquet
---
# Dataset Card for Evaluation run of gradientputri/MegaMix-T1-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gradientputri/MegaMix-T1-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gradientputri/MegaMix-T1-13B](https://huggingface.co/gradientputri/MegaMix-T1-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B",
"harness_truthfulqa_mc_0",
split="train")
```
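The aggregated metrics and individual runs can be loaded the same way. A minimal sketch, assuming the config and split names listed in the frontmatter above (the `results` config, the `latest` alias, and the timestamped split of this run):
```python
from datasets import load_dataset

# Aggregated metrics for the run: the "results" config, where the
# "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B",
                       "results",
                       split="latest")

# A single task's details can also be pinned to a specific run by
# passing its timestamped split name instead of "latest".
arc_details = load_dataset("open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B",
                           "harness_arc_challenge_25",
                           split="2023_10_04T02_23_36.963949")
```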
## Latest results
These are the [latest results from run 2023-10-04T02:23:36.963949](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B/blob/main/results_2023-10-04T02-23-36.963949.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5859017916951168,
"acc_stderr": 0.033921662504131096,
"acc_norm": 0.5896331581926526,
"acc_norm_stderr": 0.03390063768124179,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.4819184208256549,
"mc2_stderr": 0.015201987620383025
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.014230084761910474
},
"harness|hellaswag|10": {
"acc": 0.6424019119697272,
"acc_stderr": 0.004783133725599499,
"acc_norm": 0.8343955387373033,
"acc_norm_stderr": 0.003709654977628477
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562417,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940784,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796634,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389087,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389087
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545857,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545857
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934492,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934492
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584194,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584194
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49162011173184356,
"acc_stderr": 0.01672015279467255,
"acc_norm": 0.49162011173184356,
"acc_norm_stderr": 0.01672015279467255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037103,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558555,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768221,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125474,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777515,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.4819184208256549,
"mc2_stderr": 0.015201987620383025
}
}
```
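The same raw results file can also be fetched directly. A minimal sketch using `huggingface_hub` (the filename is taken from the link above; the exact JSON layout may nest the metrics under a "results" key depending on the harness version, so the sketch checks both):
```python
import json

from huggingface_hub import hf_hub_download

# Minimal sketch: download the raw results JSON linked above from the
# dataset repo and read the aggregated "all" metrics.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_gradientputri__MegaMix-T1-13B",
    repo_type="dataset",
    filename="results_2023-10-04T02-23-36.963949.json",
)
with open(path) as f:
    results = json.load(f)

# The metrics may sit at the top level (as shown above) or under "results".
metrics = results.get("results", results)
print(metrics["all"])
```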
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mikasilvy/jkiuu | 2023-10-04T02:25:19.000Z | [
"region:us"
] | mikasilvy | null | null | null | 0 | 0 | Entry not found |
weaviate/WithoutRetrieval-SchemaSplit-Train-80 | 2023-10-04T02:28:59.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_yeen214__test_llama2_7b | 2023-10-04T02:29:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeen214/test_llama2_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeen214/test_llama2_7b](https://huggingface.co/yeen214/test_llama2_7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeen214__test_llama2_7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T02:28:22.719592](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__test_llama2_7b/blob/main/results_2023-10-04T02-28-22.719592.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.471008753299703,\n\
\ \"acc_stderr\": 0.03528088196519964,\n \"acc_norm\": 0.4749886536723232,\n\
\ \"acc_norm_stderr\": 0.035266604173246285,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5884285998805019,\n\
\ \"acc_stderr\": 0.0049111251010646425,\n \"acc_norm\": 0.785700059749054,\n\
\ \"acc_norm_stderr\": 0.004094971980892084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187899,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187899\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729555,\n \"\
acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729555\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5343137254901961,\n \"acc_stderr\": 0.03501038327635897,\n \"\
acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.03501038327635897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3624511082138201,\n\
\ \"acc_stderr\": 0.01227751253325248,\n \"acc_norm\": 0.3624511082138201,\n\
\ \"acc_norm_stderr\": 0.01227751253325248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4857142857142857,\n \"acc_stderr\": 0.03199615232806287,\n\
\ \"acc_norm\": 0.4857142857142857,\n \"acc_norm_stderr\": 0.03199615232806287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n }\n}\n```"
repo_url: https://huggingface.co/yeen214/test_llama2_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-28-22.719592.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-28-22.719592.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-28-22.719592.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_28_22.719592
path:
- results_2023-10-04T02-28-22.719592.parquet
- split: latest
path:
- results_2023-10-04T02-28-22.719592.parquet
---
# Dataset Card for Evaluation run of yeen214/test_llama2_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeen214/test_llama2_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeen214/test_llama2_7b](https://huggingface.co/yeen214/test_llama2_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeen214__test_llama2_7b",
    "harness_truthfulqa_mc_0",
    split="train")
```
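Besides `latest`, each configuration also exposes the timestamped split of the run, as listed in the YAML header above. A minimal sketch loading one specific run of a single task:
```python
from datasets import load_dataset

# Minimal sketch: load one specific run by its timestamped split name
# (config and split names taken from the YAML header of this card).
data = load_dataset(
    "open-llm-leaderboard/details_yeen214__test_llama2_7b",
    "harness_arc_challenge_25",
    split="2023_10_04T02_28_22.719592",
)
print(data)
```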
## Latest results
These are the [latest results from run 2023-10-04T02:28:22.719592](https://huggingface.co/datasets/open-llm-leaderboard/details_yeen214__test_llama2_7b/blob/main/results_2023-10-04T02-28-22.719592.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.471008753299703,
"acc_stderr": 0.03528088196519964,
"acc_norm": 0.4749886536723232,
"acc_norm_stderr": 0.035266604173246285,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5884285998805019,
"acc_stderr": 0.0049111251010646425,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.004094971980892084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187899,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187899
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6311926605504588,
"acc_stderr": 0.020686227560729555,
"acc_norm": 0.6311926605504588,
"acc_norm_stderr": 0.020686227560729555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.03501038327635897,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.03501038327635897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3624511082138201,
"acc_stderr": 0.01227751253325248,
"acc_norm": 0.3624511082138201,
"acc_norm_stderr": 0.01227751253325248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4857142857142857,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.4857142857142857,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
}
}
```
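The per-task numbers above are also stored in the aggregated `results` configuration declared in the YAML header, which is convenient for pulling the summary metrics programmatically rather than parsing this JSON. A minimal sketch, assuming that config and its `latest` split:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics for each run.
results = load_dataset(
    "open-llm-leaderboard/details_yeen214__test_llama2_7b",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm / mc1 / mc2 values
```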
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
weaviate/WithoutRetrieval-SchemaSplit-Test-80 | 2023-10-04T02:29:43.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_gradientputri__MegaMix-A1-13B | 2023-10-04T02:31:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gradientputri/MegaMix-A1-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gradientputri/MegaMix-A1-13B](https://huggingface.co/gradientputri/MegaMix-A1-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gradientputri__MegaMix-A1-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T02:29:44.126767](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientputri__MegaMix-A1-13B/blob/main/results_2023-10-04T02-29-44.126767.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5837378880317159,\n\
\ \"acc_stderr\": 0.03398025722162526,\n \"acc_norm\": 0.5874514171422335,\n\
\ \"acc_norm_stderr\": 0.03395919148519379,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.4747758723075701,\n\
\ \"mc2_stderr\": 0.015144198126877175\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522082,\n\
\ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6430989842660825,\n\
\ \"acc_stderr\": 0.004781061390873914,\n \"acc_norm\": 0.8348934475204143,\n\
\ \"acc_norm_stderr\": 0.0037051790292873302\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286644,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286644\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518028,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518028\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35978835978835977,\n \"acc_stderr\": 0.024718075944129277,\n \"\
acc_norm\": 0.35978835978835977,\n \"acc_norm_stderr\": 0.024718075944129277\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6774193548387096,\n \"acc_stderr\": 0.02659308451657226,\n \"\
acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.02659308451657226\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710855,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710855\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371153,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49162011173184356,\n\
\ \"acc_stderr\": 0.01672015279467255,\n \"acc_norm\": 0.49162011173184356,\n\
\ \"acc_norm_stderr\": 0.01672015279467255\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364805,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364805\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100786,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100786\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657114,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657114\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5964052287581699,\n \"acc_stderr\": 0.019848280168401157,\n \
\ \"acc_norm\": 0.5964052287581699,\n \"acc_norm_stderr\": 0.019848280168401157\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.4747758723075701,\n\
\ \"mc2_stderr\": 0.015144198126877175\n }\n}\n```"
repo_url: https://huggingface.co/gradientputri/MegaMix-A1-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-29-44.126767.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-29-44.126767.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-29-44.126767.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-29-44.126767.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_29_44.126767
path:
- results_2023-10-04T02-29-44.126767.parquet
- split: latest
path:
- results_2023-10-04T02-29-44.126767.parquet
---
# Dataset Card for Evaluation run of gradientputri/MegaMix-A1-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gradientputri/MegaMix-A1-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gradientputri/MegaMix-A1-13B](https://huggingface.co/gradientputri/MegaMix-A1-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gradientputri__MegaMix-A1-13B",
"harness_truthfulqa_mc_0",
split="train")
```
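The aggregated metrics live in the `results` config and can be loaded the same way. A minimal sketch, assuming the same `datasets` setup as above (the `latest` split always points to the most recent run):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_gradientputri__MegaMix-A1-13B",
	"results",
	split="latest")
print(results[0])
```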
## Latest results
These are the [latest results from run 2023-10-04T02:29:44.126767](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientputri__MegaMix-A1-13B/blob/main/results_2023-10-04T02-29-44.126767.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5837378880317159,
"acc_stderr": 0.03398025722162526,
"acc_norm": 0.5874514171422335,
"acc_norm_stderr": 0.03395919148519379,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.4747758723075701,
"mc2_stderr": 0.015144198126877175
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522082,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.01421244498065189
},
"harness|hellaswag|10": {
"acc": 0.6430989842660825,
"acc_stderr": 0.004781061390873914,
"acc_norm": 0.8348934475204143,
"acc_norm_stderr": 0.0037051790292873302
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286644,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286644
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518028,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518028
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35978835978835977,
"acc_stderr": 0.024718075944129277,
"acc_norm": 0.35978835978835977,
"acc_norm_stderr": 0.024718075944129277
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.02659308451657226,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.02659308451657226
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710855,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710855
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371153,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49162011173184356,
"acc_stderr": 0.01672015279467255,
"acc_norm": 0.49162011173184356,
"acc_norm_stderr": 0.01672015279467255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364805,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364805
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100786,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100786
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657114,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657114
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5964052287581699,
"acc_stderr": 0.019848280168401157,
"acc_norm": 0.5964052287581699,
"acc_norm_stderr": 0.019848280168401157
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.4747758723075701,
"mc2_stderr": 0.015144198126877175
}
}
```
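To work with these numbers programmatically, here is a minimal sketch, assuming the JSON above has been parsed into a Python dict named `results` (e.g. with `json.loads`), that ranks the MMLU subtasks by normalized accuracy:
```python
# `results` is assumed to hold the dict shown above.
# Keep only the MMLU (hendrycksTest) subtasks and sort by acc_norm, best first.
mmlu = {task: metrics["acc_norm"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")}
for task, score in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {score:.3f}")
```
For this run, `high_school_government_and_politics` (0.845) and `us_foreign_policy` (0.84) come out on top.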
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
weaviate/WithoutRetrieval-SchemaSplit-Train-40 | 2023-10-04T02:30:25.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithoutRetrieval-SchemaSplit-Test-40 | 2023-10-04T02:31:20.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithoutRetrieval-SchemaSplit-Train-20 | 2023-10-04T02:32:21.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
weaviate/WithoutRetrieval-SchemaSplit-Test-20 | 2023-10-04T02:33:30.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B | 2023-10-04T02:37:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gradientputri/MegaMix-S1-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gradientputri/MegaMix-S1-13B](https://huggingface.co/gradientputri/MegaMix-S1-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T02:36:32.129968](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B/blob/main/results_2023-10-04T02-36-32.129968.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5800546585628982,\n\
\ \"acc_stderr\": 0.034094142771390626,\n \"acc_norm\": 0.5839863406735987,\n\
\ \"acc_norm_stderr\": 0.03407161446639643,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.445215917726999,\n\
\ \"mc2_stderr\": 0.014838089526207054\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225402,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6412069308902609,\n\
\ \"acc_stderr\": 0.004786660691181916,\n \"acc_norm\": 0.8364867556263692,\n\
\ \"acc_norm_stderr\": 0.0036907745636380034\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.040089737857792046,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.040089737857792046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572264,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243739,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243739\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946005,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4893854748603352,\n\
\ \"acc_stderr\": 0.0167187329411921,\n \"acc_norm\": 0.4893854748603352,\n\
\ \"acc_norm_stderr\": 0.0167187329411921\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983964,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983964\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5898692810457516,\n \"acc_stderr\": 0.019898412717635906,\n \
\ \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.019898412717635906\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.445215917726999,\n\
\ \"mc2_stderr\": 0.014838089526207054\n }\n}\n```"
repo_url: https://huggingface.co/gradientputri/MegaMix-S1-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-36-32.129968.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-36-32.129968.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-36-32.129968.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-36-32.129968.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_36_32.129968
path:
- results_2023-10-04T02-36-32.129968.parquet
- split: latest
path:
- results_2023-10-04T02-36-32.129968.parquet
---
# Dataset Card for Evaluation run of gradientputri/MegaMix-S1-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gradientputri/MegaMix-S1-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gradientputri/MegaMix-S1-13B](https://huggingface.co/gradientputri/MegaMix-S1-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B",
"harness_truthfulqa_mc_0",
split="train")
```
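You can also enumerate the available configurations or target a specific split by name. Below is a minimal sketch using the standard `datasets` helpers; the config and split names follow the layout described above, so adjust them to the run you want:
```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B")
print(len(configs), configs[:3])

# The "latest" split always mirrors the most recent run; individual runs
# are addressable by their timestamp-named splits.
data = load_dataset(
    "open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(data)
```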
## Latest results
These are the [latest results from run 2023-10-04T02:36:32.129968](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B/blob/main/results_2023-10-04T02-36-32.129968.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5800546585628982,
"acc_stderr": 0.034094142771390626,
"acc_norm": 0.5839863406735987,
"acc_norm_stderr": 0.03407161446639643,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.445215917726999,
"mc2_stderr": 0.014838089526207054
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225402,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6412069308902609,
"acc_stderr": 0.004786660691181916,
"acc_norm": 0.8364867556263692,
"acc_norm_stderr": 0.0036907745636380034
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.040089737857792046,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.040089737857792046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572264,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243739,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243739
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946005,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531018,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531018
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4893854748603352,
"acc_stderr": 0.0167187329411921,
"acc_norm": 0.4893854748603352,
"acc_norm_stderr": 0.0167187329411921
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983964,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983964
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.019898412717635906,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.019898412717635906
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.445215917726999,
"mc2_stderr": 0.014838089526207054
}
}
```
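The aggregated numbers above are also stored in the "results" configuration, so they can be read programmatically instead of copied from this card. A minimal sketch, assuming the split layout declared in the YAML header:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics shown above;
# the "latest" split points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_gradientputri__MegaMix-S1-13B",
    "results",
    split="latest",
)
print(results[0])  # e.g. the "all" accuracies and TruthfulQA mc1/mc2 scores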
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atom-in-the-universe/bild-8cb6e403-fdc9-4c10-802c-708bb5b7d627 | 2023-10-04T02:49:48.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_FINDA-FIT__llama-r | 2023-10-04T02:40:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FINDA-FIT/llama-r
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FINDA-FIT/llama-r](https://huggingface.co/FINDA-FIT/llama-r) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FINDA-FIT__llama-r\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T02:39:08.372794](https://huggingface.co/datasets/open-llm-leaderboard/details_FINDA-FIT__llama-r/blob/main/results_2023-10-04T02-39-08.372794.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26015680684073617,\n\
\ \"acc_stderr\": 0.031642820686305716,\n \"acc_norm\": 0.2612080605748792,\n\
\ \"acc_norm_stderr\": 0.03166036748428084,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476197,\n \"mc2\": 0.453779147635165,\n\
\ \"mc2_stderr\": 0.015616746455061977\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.17406143344709898,\n \"acc_stderr\": 0.011080177129482206,\n\
\ \"acc_norm\": 0.2158703071672355,\n \"acc_norm_stderr\": 0.01202297536003066\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28161720772754434,\n\
\ \"acc_stderr\": 0.004488684397979514,\n \"acc_norm\": 0.3018323043218482,\n\
\ \"acc_norm_stderr\": 0.004581147247963204\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.03861229196653697,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.03861229196653697\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173042,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173042\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.026450874489042774,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.026450874489042774\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070643,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070643\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.023234581088428498,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.023234581088428498\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28990825688073396,\n \"acc_stderr\": 0.019453066609201597,\n \"\
acc_norm\": 0.28990825688073396,\n \"acc_norm_stderr\": 0.019453066609201597\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159274,\n \
\ \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159274\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.041733491480834994,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.041733491480834994\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.04750458399041692,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.04750458399041692\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395967,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395967\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.02249723019096755,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.02249723019096755\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.023891879541959603,\n\
\ \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.023891879541959603\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432403,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432403\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23468057366362452,\n\
\ \"acc_stderr\": 0.010824026872449356,\n \"acc_norm\": 0.23468057366362452,\n\
\ \"acc_norm_stderr\": 0.010824026872449356\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.17272727272727273,\n\
\ \"acc_stderr\": 0.03620691833929219,\n \"acc_norm\": 0.17272727272727273,\n\
\ \"acc_norm_stderr\": 0.03620691833929219\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407314,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789427,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789427\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476197,\n \"mc2\": 0.453779147635165,\n\
\ \"mc2_stderr\": 0.015616746455061977\n }\n}\n```"
repo_url: https://huggingface.co/FINDA-FIT/llama-r
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-39-08.372794.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T02-39-08.372794.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-39-08.372794.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T02-39-08.372794.parquet'
- config_name: results
data_files:
- split: 2023_10_04T02_39_08.372794
path:
- results_2023-10-04T02-39-08.372794.parquet
- split: latest
path:
- results_2023-10-04T02-39-08.372794.parquet
---
# Dataset Card for Evaluation run of FINDA-FIT/llama-r
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FINDA-FIT/llama-r
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FINDA-FIT/llama-r](https://huggingface.co/FINDA-FIT/llama-r) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
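Since the configuration names follow the harness task names, they can also be enumerated programmatically. A minimal sketch, assuming the `get_dataset_config_names` helper from the `datasets` library:
```python
from datasets import get_dataset_config_names

# List all 61 task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_FINDA-FIT__llama-r")
print(len(configs), configs[:3])
```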
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FINDA-FIT__llama-r",
"harness_truthfulqa_mc_0",
split="train")
```
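A specific run can also be loaded directly through its timestamped split rather than "train" (a sketch; the split names are the ones listed in the YAML configuration section above):
```python
from datasets import load_dataset

# Load the run recorded on 2023-10-04 instead of the always-current "train"/"latest" split.
data = load_dataset("open-llm-leaderboard/details_FINDA-FIT__llama-r",
                    "harness_truthfulqa_mc_0",
                    split="2023_10_04T02_39_08.372794")
```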
## Latest results
These are the [latest results from run 2023-10-04T02:39:08.372794](https://huggingface.co/datasets/open-llm-leaderboard/details_FINDA-FIT__llama-r/blob/main/results_2023-10-04T02-39-08.372794.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.26015680684073617,
"acc_stderr": 0.031642820686305716,
"acc_norm": 0.2612080605748792,
"acc_norm_stderr": 0.03166036748428084,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476197,
"mc2": 0.453779147635165,
"mc2_stderr": 0.015616746455061977
},
"harness|arc:challenge|25": {
"acc": 0.17406143344709898,
"acc_stderr": 0.011080177129482206,
"acc_norm": 0.2158703071672355,
"acc_norm_stderr": 0.01202297536003066
},
"harness|hellaswag|10": {
"acc": 0.28161720772754434,
"acc_stderr": 0.004488684397979514,
"acc_norm": 0.3018323043218482,
"acc_norm_stderr": 0.004581147247963204
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173042,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173042
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.026450874489042774,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.026450874489042774
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070643,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070643
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3,
"acc_stderr": 0.023234581088428498,
"acc_norm": 0.3,
"acc_norm_stderr": 0.023234581088428498
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28990825688073396,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.28990825688073396,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159274,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159274
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.041733491480834994,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.041733491480834994
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.04750458399041692,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.04750458399041692
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395967,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395967
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.02249723019096755,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.02249723019096755
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.023891879541959603,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.023891879541959603
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432403,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432403
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23468057366362452,
"acc_stderr": 0.010824026872449356,
"acc_norm": 0.23468057366362452,
"acc_norm_stderr": 0.010824026872449356
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.17272727272727273,
"acc_stderr": 0.03620691833929219,
"acc_norm": 0.17272727272727273,
"acc_norm_stderr": 0.03620691833929219
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407314,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789427,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789427
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476197,
"mc2": 0.453779147635165,
"mc2_stderr": 0.015616746455061977
}
}
```
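To retrieve the aggregated numbers above without downloading the per-task details, the "results" configuration can be loaded on its own (a minimal sketch using the config and split names from the YAML header of this card):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; "latest" always
# points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_FINDA-FIT__llama-r",
                       "results",
                       split="latest")
print(results[0])
```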
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
weaviate/WithRetrieval-APISplit-Train-80 | 2023-10-04T02:47:03.000Z | [
"license:apache-2.0",
"region:us"
] | weaviate | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|